[Binary artifact: tar archive, not readable text. Recoverable member headers:
  var/home/core/zuul-output/              (directory, mode 0755, owner core:core)
  var/home/core/zuul-output/logs/         (directory, mode 0755, owner core:core)
  var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log, mode 0644, owner core:core)
The gzip payload is compressed binary data; no log content is recoverable from this dump.]
'{ptH[>}]E8|Y807nN*|+cT"JXMx%bK%b#Wcj$:CF8*10LjrL?tpyq~:E&sJCSU E34dU45tsp7s;Ҩs~jآ+w|?K_r~vP1#~R<4ctm1zdg \p)M,[|N7jvuzq5x\n3 hiYT%>!7DsR-?6&y%x[~6̾ξ֗ vWg"mW=-\!c7O*N'#r~]ZhP4zT hq]+ue ו2\Wp]+ue ו2\Wp]+ue ו2\Wp]+ue ו2\Wp]+ue ו2\Wp]+ue ו2?n] PI\Mp$#&H$Y\# aq5Y\jW,fqKFiX\jW,fq5Y\jW,fq5Y\jW,fq5Y\jW,fq5Y\jW,fq5Y\jW,qZe/~tMv~u/0\k/-#/ՂqHZpCVт\F nӀmI_62RCM|A\B?%(Rö!$*civT׻}.8R6 S\[YsWg7# P*Kw{2dݚ Mu%yZ})[F㺒⿩ },o1'`GQcd׃xZ!O u8xT1j7FŇS/Հ>s%DfiMu<+C%oI/Z}]ZliA>ǁxm nqj5M]5{]5w}}J]k{`7;8J(+Yc],hGU w-h.٤'C \4᤟0 g˓ %kIRw͘ 1HIu<@ m3%NY:m̾p`n^B7w_ΡZajf1lt?r^ ~u˅LtFۗxqhn8:PIRkf`˜ + =uKntU%k yS"pETʺ4 US JP)0Ek66f? "IczߨQiw1Kf,eNBm҇}4ì-s'm9|P 2 擟 )n*lRҔ.Z$04_bD{,\w{'ånX(|=H*9_/eS0v5{}S7CMI{"5^f kVZlc/D&3ڔ4.6| C WG7 ΧdzTEB?dy]gfUقƯ407vCfuSX(!|y^a]g4qvN~>7~d&pe$;U lfWuXc9m SnƮZQҐRƬ@궃XV@k^ƨ6̗DզxND .W˦>nSJ gHv~t6EcfL6wt9υ"^tDn XS!%IC.6Gef|ɯ0;IU ZjVUw9qee_BebRcg6(Y8S9nٛ=r3a7\{/9ɠ9A^kKyP<*NYk`G0"ZȢ[cR;ĬW逰  X c)쳵lC W]o7UHJvhk7 j-z`pl>Q[kӘ5mOG*jo@Ky)sݻqv;=`a^_`jl3fXZem?̣,8X|l N>wIŅH(9\!o|Bx'GAL0EdHfDQZ-89&r6ȹM ucZ)"V_"D4-ڶ߂[BɂJ_ig#̻n);L6%'2ڶ RR#CURimذ9"0gP+]vlװwۻԸmѳ/\ slҝ*w,x ;0!H uYhvJ>:Ӂsƻi?5L{ #QHY5aj3jX$"﵌FMFSQ['D|Lz}z2Mx -,Q3' qf%*JKI/lA 9ׯds"u}=3Umy~*f~tMm&GkQ.brK%`:M ,3 k"ZH]w+۷ 9m:*Fkr%D%$@tY$"R:"f|/"88f0 TYGb9AQN"{@ Ҙ:(lXKc/ƴ @A QzHLdu4(@%LR#1s:V8-[Jk)&))H3s8e.ܒwp {=C9T W^UH>x;}Qzn_*=?CD(axk!"(BP2x.6exAi2Vɕ]#z:ǘD}1jOb6*tbxQ׈rE؄ A2 c*5(K)µio8z<qm۩usD+# <2`2te%ە0WS{cYL@ΧihL-/(Tace;깴Ra⢜[4d1-B3>$MF)Pgw-a;"E Rr< TafXT+4Ta'_;sĝ^5A<5VA)N%pb N5aŽW.zuUOkEcJrDp Y4L>徺p߹39+B zr JUL%ǽљ2;8X )R>\R90{g卦;f_swœwo'o-Co|c{u]@dEh'\*[ҕ߾m2tI2˨Ї Q$ УŚxˑhU ڽ%V-{W0:JQ00`ԓpIs|;" 8;&u]:>\}ݻ^/.}uo.?<%t -| l,o~<`iQҰliXuo=xu K Tmnnv\D'#.ݿeO :-d&Xݤt%v? l6Sgh `%,D05/6E}sEgiU8,MglI1vJO,8SF_~< ˲WY2:Jj%q#*cT֨{lȌ?g9ygW Kg~"`# ae)pEy*kPFa<;}K֡pb1ERpv-].T_YXՇWW'(<ūEK X[tܷQXC9iq[+FBBe x ?ۂ*m-n(ȘLeL!/LBR#SSG͜s]%ydcOL8%b"m1g;J"4]W'Ë(],լb~.yڂd{NǾ>"7_x;/ &]\PRg&{LyyAM4A%ZwF ӷP% CZfX5Fsϸ פj:Rj3fȴ1KLD@gqsqÚ* ypVD!J<6l+>TL6/<fp#4 &h3qI`~C`=tV @ι |wݑYvڿ>KhV@R?_JE<Q\B4fcMD3֪Tm琒|l9f,S_"#GJ~tfG/Q-\}t W? 
1nnJS)E)/0/ ^4KIh?7tcTBG*K{}DE>1GmQto~J'pv~DwVfw;R/P|gQYZcY.W3׼_'W*^]իo |A> -?xBCWU,<73 -\: r0v/>ʿ:b铟[f5{ڙ ɰ?*,yN<Ʈa;?6v|@7MՍzÏ^HP}֮ۿ:Ͷokw!_StQ.[{K?ny|dOE힩Օ]lrFw4$mi5:m_YZFg0:4:+Ft>fOU |BY ̋',f^fOwC.޹ WjBL< vǬ"ko#sן bZro14%c7j a4C(drKgd>Oҙ~8rp7QBG}UR}ǿb!_`j{,8wA⬝O8m0.P'T l=5jlj>H% f/S12u \> 8Fau@#*)v>:ɝFPfT 1DwmI OÎ`\'n IlgIqHJ=,˜awuWWAQ(R4wEfsȼN>X<+C2Z7·Ӳߺ-VG _~1cbλLdλLYs EY F;aQ(p~6k"bV2('m;julg|FmWLF0w=d߼9%N:mgZ7D' Kw` tJ0np}vrHpQ5 ?Pgqؿ_𭼖䓳J}:f]ƟیBy(. l>;2nE=YA17a<<J=zKp ͰqLlE7{Zd2wTυmA@#H3~4>\g"Qk}&J>_+~=FӀON6^Ŝarc̕ҊyMh}*|y[i9aR/1",DDꥦ0+C $ҍa yj1>hAq8\h.@-͂Ҽn?4֢ k7=: ta^0gn^cfKќdsk,QV8dYn+H$!$r;c$Y7>!TDϣ [&i+.@(h&BAuEG?b;y*b;y*iOYiհИ@W͔RfrfagvV>JY &E+ }($A椭zYyhyzwΛ -,-9N`.0 KZʩRp[ 5klysտ^GysB}%/@S\-&x#32x5sf"Ha#lt=!t 8MfᆕG#W]-tOhҖhd@aD%$@tY$h) Ӂi]IՄX8ze \HL3 r>0Gdi#K+c}A2FHH5pFBb%#1 )DXҠ8KFbRK2Vcl%Vv)iG-a;5YV1Z{: .UzP/xۛC fˡ~s}~3T1/Y^ǩvG&Ql!mI3yṼ@,6)7dA. $nxۨ4W[qdt?$%QGIAŬ2QG)Ǘ$Ͳdޣ,$-[Ñ3،#3W.4mY,}j~hg Yyw²ch{hkT~6up 6 Q2;eYfٲ)kh՞3y1E2_(Y,}LQlbGw0eGǝdy5 ڌ75}5¢$lyM7"(ҍ87Բ\ tbΒq }3+s] m33̬> #TqQ!g3٪IzူKcqCAƨd<Ŋ٨A(9G&C:U؄ A2 ǨU1kQRk^jF5@1r6 2]/%}^үgakpui'{wSbC&\^ [Z4N9LZ2p\ZՒʹukl LG8ps E캤I{;Ji:$4n%RTDfEȿKJCH&֧|=U **+rVrhad &4MQ G(F( 8ñ1 `0L!|袗^h9Z tsִi}Dp ,a3BdDîi/s)"es|A.ӂy RTWl/<+6F8_VllIh:+2 6Ÿ\7O$W3{Oy.!&UgO*0 Kocg^J0;OVQcIa GAG{-'S|()VpHs`)"tz%aNhnlt45xs:MU`mE-669kBJƥCRr5Sr=LH]V/.: q"c~:;?>3Y/|*" >l` URfios6rBt:(y6N&/A֗ 3l>^qx /Ty#gs#S$0Z>y[gKb,1;\]3K`~.Mo:{ڞֲn!Lj]*@ } 3f1^~2ѣ>;7k{%huAZ7V8 XE&:|WR>Y ~8AW4*`M^6镏3\r]~ח߿D]^װ|30..OkIP[V, ]&]S6Z79j7藐&\#IڎQ> u1abSHÞ N]I]|'_Aoʻ(."U>w A<مOW5L#5{XyF+u1f %&e8ɋ0V{zf)6Gß̱+HIhE-3y0S^üzS}`jۗ cwV0@Ք}e}U^;bc |۞ot!QLqǍ:*^&KLrJ͙)0E5;=lsa*Xx iK4wPb[ā4?cV5*Cz -I/;p%>c}r:C S̊: Ydo~Bېuibq/{i;4 \hA!%o~}?j,ewxv/PwU`f%րvj=l[u Eq{C7}"r ͩ.g\{s@$žR "yyhk"$yb>U]B>)8जcB)7j<[֑aYqGkZ S.ED#ilգ>Xf`V%Q-mD>6v`W^-ÍTT2oTΪ*ejV J֌ hmd,^(T4PK>T%6ÁH AHRVa S띱Vc&ye4zl5ͭ&@4;|l+﷚jfVj'޼>Z.V7ZF@u1ڣC.]{oF*~wvqs&w юF2$y_5IɔDJDZ3#Rfu=~U]]Ί_ԍ?erv?5wz7mu}ÃIr~~f{[]l۝[z,X|ބEjF?=zY5k^uk3[ -Km^{t 碑t 4LzD5~ыRCӻSŶwx 
LqY̘C:,Ncʠ+җWlMBV4TaSWοw%i%YGݴikR%e0vkKA’&8uLa/rTǾ}17,?ܻw~>GpB4<DCE2nOr^ހj 7/CmRyO!O٣eYv)ǰ~x@c^ owH!(nkhM|VQCs7$BּkR\ڒ^>|^׈s  oAz5v]! k 3H>kgȾGxIT_f8P/ư%P  GG B\2PkעpKZI߂Sr> e|^[z,~|q_i }?^axӦ[?fT]:m<<,aL 3KOY)P E3=:uo+])y" S0 &1vnVej; ,߬r<HOSxv(yQ1my<zjVWGE8EQ0͖͟xFpVmJpajx |WZ"ܵ6>}!ț\&<}(]H3o3A#Θ5 ^,xgPPJ8y*tl½YΔ U_KEГքL`bn344AR鄡nTi"ian-wI[ݜw&V)oJ!ݏ&`T\ptsTat@^9/Y$.NQz69O8W|uW;f%_HUU鹹I3g|/򽫛ov^T_jY/4B/^/5Py7H⡫[W"Z"uF;Q|~9i 6ISa1 :1,KΣukfEB=j CT"$ʚ}Ъ nR_B6!kU LFCCϞ_H4^R8ELv_|@Hⓐ ~V |Q! Gaj(ZVw:>&#mȎ0BPvs*$ qoqOzĵ^6yE0k;n|ir%tdKrҳ%,+El/'[c&jAsxZ?O_fW?mɜO_3oW IzÚTpQsTzW= 8b`CH["{iZK5_3%@q4q.h"&qC@G.5QNXkT/:c1D" f'"Ȧܢ%c6KL23恪iQ/1WX:lZRKXpONB@3Vˆ^FcD2Y+냉ԧoDDL ` Xy$R'~2`J*[OiasJ0E~Pմ:8zu+؛WmSY{;f~0ɼ-T4.ɀE_%V64n6CPbqOx; tLtt6.Z9tl_<5̘k%nS|S`o?KĈ*VzWQ".XJt &&VQR1qV{tz"iZ+IʦX6^AOY)0t=۸R dmiUozgzp{z-d\Yde-(B*F13ɬ`_vr6kzZw8^rDF81EAz֖xT(+#XS" l^jq3OEBA5#z0w Z&njG6vS+̧EXku`ps}w` J(p8\HJ a9-Pޔe0Sog 4ygy{ћP!Ml(puj7zA͐R .LWQz)͐lw{ͼfHi͐fH/^dff(36x sZ;(1lf% ,J@s{ KKB9U >˽)7N.ZY`eyюC/\POL&b9"hd0'ZK(Iz yI"R:" z`H"88fSbP A|@N@`kAc}9@ANXdɘV!!: $H[XA`Ir,aK' i ֧] ZruЫǗË{7x3zh÷b$(a~߬H=uC?DNQ4>c!m E2C-bJй쩙v7=CG=no?fDթ{zܻ,L}.p/ќe"[ )/GFʂrܧ1&u%}HYs</-pR/Dr4 1WYޥ8, _ߦ(bj7貺G:3="H(bޓq+-еy+>@i %UӸE7Cek%-)rȝ عN& $C@H)D0:< adlp26xu:+۪J`Vtb&$H.*Ԁ)o`Cz[&< icq;? L|eLJ]=-9>>^2d( u:-3-l"ݔsQ*&?vVTݓ_yëEtQ8 2 +7ETwvP6x=^lkapڙiH4iQ Ї0QiI>o]+G8ճVn I9unBE!ibp^,Gw5.9 ]17(;7tןκoN_}>^u7g]LTWp|;0 .囬Am /D:}ESS|TuM5%%{|R~ټBs5' J?v_~5{nN(t.g&hŽ$t%v=_2*^&*,P ;g%,D41.&դg_8R:JiY]2[aM̰yѓ ȯnxb,YT`#)E|WkO?1~Js 2Z:la İB9^HT^R*EtQMZn֑?g9 l΃A&i0MG60R. I94jcзza1X2"rv,rveڒ|'>*t(]'[[x5uu [ߊrV׶Dn44یL!2Ad؎ۄ*mB&47RKBdLQA^c\CR0ayǽ9mQ1ik% bS"7XJ!fc3V: ɓ.Д\[.(Q)3<l)2/9)F0R\Mn _첻j_53x8?To0ܚ~{|[svSgikDp IQF)3Zd%&kw 3V r.#W/*gz V4-Ҝ5hlVrgW's]#Iےx^dA89Ex0 xD9I~}:?sTN#>kfd >14.ݺLFՆ.18p35Atack5{#1B"^(`9^r" $cLK21%>OؾdV ]\9+A eȃICQ/]Lfb;SzxAg}ȱسw3zfг*ȰD?Lyj|+ƫZWG׼ίUk<؛pUdH\NɆ[^?CjUuk|.[K# 2?H'5|'6uncO6=mcO6=mcO/@T@iEeRr@%>&ryXXatw:%r '+̤⿃F%>Isaw}R'lb!V$"aK0uf2 Ws8: }m_%P0 p+:mqpY@\:$UWS>\7λ `Z8۝_fC/MgdrVRI/EGӫ⣌T%ɡ[LX~8Xk8+. 
\ɏ;EdfmzzQ>8]F1| bw~Q,]0 _O8 3mm? NXޙ8gNC0fU΀ } s yLpп6z2w՛XvVVZ-nֺ*$lI9ϏqLz+vEΰ8+M~+ .oߤϳWY={:LKH Ae#ܙ_1hjj*ԺUƼ4yzBs5' J?v_~5{ܜF).IN3^~`_AؽIǕCPx*8a]( &مZ5{ GjQX#- -%a*a(| J=Yw.ƾAF=iR;w\4~׭1",@ k- SD%ZI\ʔJV۵uYN=G{8@Ӝ>52H{_)]bX`30,^yxcxJKF.&az=}kϭFÉ%Z,.g"wn QƯ-I{RxS֢=2`o+{HlT[apc?uJS_F|GR;EY,F{pE쿰E5Gi+RI3]wSL'GA:?5!LDfث1L(3b^Ӕ6s )֌4L+8.q > ZBY Fr-B0#BXYL^jʈhA #(H8 ݰY1p47qh<|6h4}*|t[VZDu>3ӋB(N'}O+n"vA;&!0tD +5 '4P(#7JNcA3Z`(3*hVcAɋ 5@T!07|q@uK_mu"^h5=iQ, fwtG#LTiK1Sn3Dcp;,\hȝCޠrB[0v<3zۏܛ)GH< ho7\m"t<_A]۝q^TE'cU貗%j;z!_z˟& }px8f=D)`9njt 2̖0[aҬ[ =VK;&bqy`AuO&t>ԣԦ}*@6t6M e6='.wrU,c{ zϮaG'[)-L / By|,ޯ?P`wU; V&ʎ[+RlZG֨f3 1L 4)kon֬pR hiVTZ8MH'Tܹ :IQa{ېW@O~%o>=՗|_PK~KL.w=[}yKDs4 tfv 7>. 9%ا7>%`K'EլɏN1qs7&^Tx0)1L9S(E]N1Ζ9zl 3PG^]wϓ_BKT d\<ۤv&2(Pix`,f(1xf23Ń/YS&(+k>M#*@-QPj@Gᘢ A^kKyP<*NYk1C ,o-KԞYCzF%2nyEcł5IJw4DW1%a}.P STP`L *ϽOR)PŸ2hTЎsk&j0|#F3z9rVvi@*7f25JxLT'"1MĈ]9C"e`T<j[#T3V{4,/^/(Dit;eR\|ʫXzaNF%Dbc!)@] ˕)ΟI2>9[-GpŸZ҉"7ߔ8!ZEINƷUeBS*맲'ն,@Gn ޯTsu &}x-sF-YsAR9EC!^ q,tJ`U Jn)`b7_еùʼnM651ӛه͵ci~9ltX(a&wqKpb&|hF&J7aV BXRYﴠ)$G` x"^EBxQY,"u00f4%D*@pjT,e ༆(ՍHg(nϜ&0V,!CZC";(6BwLYU+uOK?-ڲy ]8R(BGr4#-U|i -@ LqȋBDH!)pHLdC̜ exkE}6Q(A +54rgN H2Wq `BT'Է`Yg~"]?N,^!2!9b`S`9BQ8UƏ;stQy ѻSy,\5{P{0d!DN>NsEZFTVZ>YjiB- :g$V{"'f֚HK-&܊7!-.ӵhnH{$rSo/r墸KQ@E zD Xy"+P%Z3+a`EweBRhޣ HGw`+z4{VdRk]i.ڟ2Lvn3ٗ:m39ӁRQNE$y9y΄+)74GE6z?yв2Af̺Ch Ώ  tcQmWn'MBR}Y@l!MظYE9_⤳rdV;Kng1ka8\i>0{]PP Ugcy{D3wzӱٰmsAq9kx`f5khjzox ہwRNDO+^d8k:Y#v֝fҒB4N:T5Mpu (/EݎIk.Og:}0sdq L gqrs[RE7B7=k훭,m:y[ubJT2SCPKJ_Y6j3v2\6^8h YWРSIZ pi|ON;~VD﷖bԷi IsA ! f4B  .2êv28rۃYnd+v|wXsz0GƮ]-Fc+ {3.osjK!SZ5q߱qzȒot;:;sJ8<;_OqlҼK\cI8D(znJy"<'J ('4 6(eD,B U`@\+ 빱1$˦$_ <#:j>1fJ*SaZSDi־Tɟr> &w$ϑO |0.e?۔SOi7?[wrb\Lh\h9!=#ڽdh:^S"TOY6ߊ:w]gڌVfookXhqb D;MM8/o3 RHF%S69k5Il,o;9ofW(È$H 2O}Q 5,N)}+Mz_d!d|[U]{~3*맲'Uk1_Sy9W!eܨ%k.*1q0GuH2DRkB2N AR-lB&KqvHH6&f4O W=ڠ_N:0],<J Ece *!&$&DM؀)og:/T;-( )EgJQ!B$^ȴkxD ylT6H: M F$AKH.8!(Q{tVQ,OY!C!j_O3zW&j㬪vZfYؼ݅.)tA Ѕg}#b$ 9bAjVX@DL! 1RKlTKZxR/HD:υfB)jc,oଶq. ۻ M oZ6&Ζq =?l_iγ$}_kő/G7c_wRi~{3"]69 ,ho.M\r֩(8<唦 :"! 
&);_uPMW <w13L\ϔL)%i 6׊Rh4t(0"瓛OB]X)*EiҎVvlnyJ9DНgZ HiMhOAݩQԿ5=kʏHaenfW^'J&H_H`: 5 +iWĄز` yD#W)FnA5b%3ޛ=wb җD.7Z%ۋ)#ڒޏi e8ǟ^U.TO *~} 18Od^+s&>8yy0J('욜](dm3[ n 80o-Iqztr)'Jq  q-H('Y|J|CkVw%meJf#zW'W<* ?4OjoW\;*U O{o4kݛ6ˌc\_on& o#( a0 ӑndPцLTo/p18| $DmO 骮V eU0 q8l`G|˻>Wɚc^/YkݻTHEeesFc}SJ"~Nho4݆x`ń@k*Q'1}X8KTǿo>/߼2_~:D3Պ{ D Cײy\6]&z6[X\B~ m/=?9L.)Ҟ!0MG60R.*1(u /WiXބjdjx\Y C}k]w+jnv^̘{.vޡ+Ӹ,a`Ŷ׸ w'-/i>bb0C0Sg桄3%a3 $lЖ:}<;~IEΎÕf6BWo׿ m\=P)em?L\ؿ"i9b^qz?EazQJ&R+-R9sJ1?_NG |r .mg)ܛUihcn^({ <0啶vuz?T~\;i*?|(p /۪x)4z]k?u;:*=B٘qV}?5E \ϗzMiWMZ3ް`F %BEGpZf0!`dZFMBy^NAJ u0Jehv,γȥ^x' nղWUWU7aAȥ z* UZĀ.&t1S..tt1*zJeն{JPNz˺QrVVyW]~2W+LgU{nww6l3gz!Ь.aqyCWLr5A^NIsK;ZJZ)˞)Qdg"L3v&Dؙ0w⣱qN ;eRLI3w&v ϲ;zgTÙ;kZִLI3w& a$L:NTjK/Tb݊W⢁ KzE,!]Xnu OXTxU #~ IUp;oB!gh՞:#,RFXp44LG+T|SS\ps2p2!K $ۆDrk!( P,rfBNP=DoS2" #C6r $A̛/X˃fﻯ͟3TSӃ9!ZNӃ9=xNӃgL|^fxeη|9nη|9nη|;ͱ|~%]/xbRO=DN )ͮP0 ͒^cݳ}cוos~,HDKNz:t}0y!ť stkZZK ူ`u^R[2F )VFŝBY 9jt (WK${d:Qc(,<ukYK x*C$I0}<׷WsC7[eM.N-L*LtS*l bG=V*lrn[D#@aLKP>;ppA P1u489H;"E Rr2[afXT+`4T 0hd=X(F̏ɓ/e%VU`BӔPZda:g86 Ƒ)iK]-'ZfF smӾ,=;PLt] 3Т8_[^ Y da3C#^iS/=xh}.y8V ZjZ] I9nu62B!aセKUzVZ?ǸtRanжxf-On?.圾iNݟ0gWN+o+ 6u-R `Jܵ/M X.sڍ[K#Mcel L[ SVaXѳPn=EcEf gexPQC{Nj\ $g߶S@F+XQ 7 {ZH4kKJ%1Iv궽:RTDpy"c1ؔfaXY Ea> \rMF6ax{/ouHnr?V; #f:EZ9j7V\ 2ZSg}j\wI /=}<;S"gG[3x+pp߿L`7.MW&ؿ2iՆGC2)0(%޲ jg$8%8e`(p*.AEp!MoXALi < N -.;?F<jg=$aXچ=촜B8uCrq~:Mp ڭ0moh/aIɧU.Q|r2`!8 љeFNKw҆L߮Amt$ RN& *f(t-!dZx(b&3Fet[= 'ӁkbRJxɋc"@F` 哠 gӲPK~\*v]\Sɡ NWu=U '{&7{[ONݐCݪV\JnWqLLRlmk1K.HhU =6g&ZzLJmp6k(e/5x.-B'OL5,5 nɍi\M\ğ=P`yПL_nCJ+3i EImph2Yb̺,%p:ǔY4 얳 N&LԆe*dJɁL(!aĢIب %0+d,doC/;dɘ1muR pD.0D I'͍L6FC!B2ְ!IZ'i==Plu<\VAl'JVӬ0樈.kۺmYa/% -ϽWVA?cTNPoLmDIK[I͋cV==:_L L*? q:kZ!:MԲB|3sA"ܐ<$LEɌU9%ezLIkZ*k1:C<YhIR(OȲRb~5E,B0_mmלjm8!m- y$jډKO_;'#I֡' !2O_&c PGT{YIbʡ̕B20οo='=G(=G}7^j\|&{)\r 56<ƨ\fBy lbY/R5NIUr >7,N?~8/汣3utkEG[|mwG11A#(0 X)_U նZJ\F+KO\F ]?b `QO$^|uk'j% g*emY~pdʶ2sB9sOxH0dͬ&0UkRy8J0>٪,6||޾Ϊ]r&1M _MV? 
\LM+z'r']>޹2Ԭ'+zrqxfבNon|nlʓa<ioqw},yFOfg_ulbo|S|"t'itĽd-Ε'W?k%ӛ!ї囌~XQe,cf]da- xi||j3?e%YXA*]* JE~M'su<~M#vb+U Sb;!L'CdaQĸ d@Zze *eL&gǜ3[٪蒶}Z݅`:DY1ܧE72X!xmR%I(ghB˻ g]%_yGUH*.x˭ F FeTQ"cÓ$ LR#:H228-ai($ Ӆ*.'-8Lw&ft(IGU-_ɿ5DJ&M& 07Gž=Tk"ƭBȺh0xY*-۰?}N&($/E -'*x`2^<&nj]A l{ab wKs-M[#Z9d9Z 8*Npet& @+;]'H cɴodJmYQk;QXŸo #8g칲X|"i}=c)q"]so?8L.|Ly[0޿gu,Ԏڟ\w'ox-slM %s`1L]SCx'tWG)4p,y KS'LdƦ:+"H?Ӫ>Jú3Ò:?<&;Ћ9zV%s*e]ō-8OC&@q/i8+'PjV|ݘI_fw~o|w9=_x !̱̹AlFY)("F|9;%ii-0ԓR=nDc7nVY=!QDBŲ'}57*^lid uRbBBJNJoqʽi~̏xo4 Ŕ"kg{qF_zû|?' {N:" LS,y?`w/; `>;]Z5ZZئkߢ_!ڜrCwħ[[\YkC/ $*:WruHK^3*U{ |՟5QhV%kb A<4}㱅[H7н F9S7Z"8iF]?z.O6!֫dHdSȬ*'O_iۨH1hH*W8j9mbrJ:tF#bVLA:uY/ Ňɍd'S(%uBp$iuB>]rS MV6&izyR )[6g]v{T6nE  [.`.${?!ٖVSdSo{V__LYDJ6+.Wte.VYA6;:@W R_V!`bGM~jXJ;H'HU\>ݹMo]gTE`3Bhs\JT: v+r:r\犓r)p(ac< A!(<$A?y.&'Rm!2tN[W5h"2IWr9lb[dy\;OMNEoI2+mH_?CXkyXa Oz:xCۀ)i]UꪨQ1,%! jCen( 8;NL,Lۋ2m?nSTٲܑeI$ur?W@TOH(8>p) &c@4…lGnɎmG>g#v'vMI+z$ܒ\L_v]>QhvN'yodf?<.z{ }Sԝfvq̼qn|[?{m@ןݧv afdY7{7[;$>wN\Kop\-w3txo%kAS<ρ̗%Μgf̌;ƼgYgW';f |Z''A&XpWx0"D1YZV`SFS2 u8gvZM{/v|?}IsY )auQ=߆4UPg}`u0k3F)`jrv6SхK?f3 ~7Fh2 +x}f]jͨ3UL Q:A|V*;LO!YF'VG/HfP0Å^6R½8nz7*|~J=U<M8ܻM* }λRJtO;{nyQuy3 Jt*S}R$ZBUSwqVcf0NIM̻f~ד_bZU!D\)0?qPUVJTj7(4'W`F\CW@-C{/8G\_==F)v.#Z\ctH\G%3qEW4z%1:qU{0*QK^\%*WD + ~0*K塈D-g.3,qE a#+퉫D-.\dq +&Rd JJ|(*Q],Aq93suSOՅw3t82WgG$dZZmA\a騳Gѳ#1ΐZӔK((H4H !M%#gh)Bpwo$T9pȤ Q_1疪1*h#1M_+mKxqʌ \xJ7XKcHc!KDUPIg_V}ߐiKq߹g.řKqRg. kZRiḀ8s \3̥8s)\3̥8s)\3̥8eTRAg#s)\3̥8s)\R0̥8s)791v.řKqg.ř33f.řKqfȥ8s)\3R.̥8s)\3̥8s)\3\3̥8s)\3̥8s)\3OCˣ[ˋ3= )-Qaz0/aX^_p%7ih 5(§YnoGfOhC?Y" "g+A(H8GY4cȶ (86j<[֑4.HrbrX+?B"mݗB{t^kCkv}#Ҍ6w>3Vi+:a7E?d'Yvw}~{l^Z)L GV\ܿ{ ccd[==8W*zNvwP`HRVa |LwZ D佖豉hj4BZ"Z;Cpzߦ}xr&PKX$NB<%5"HUp >hLǭ! 
bQ.(qCvYI(JhR"Z6L[#g9ai$;Lz S֧}YhmB<)s>.JjX4ذ&epNhKcʞ]Qa?({Ryy:2p,xm1j0 +눰6E  SL!>P2WY7=)Q[ P#32x5sf"%ck,QNdak+㩲, q[u'72})ZOlI4|zz11zuN4畚HxOxC@]ϳ#jrv6SхK?f#Lmn} fBo^mF-2 ٌJ~f_ D4 hIRO!g v,i^..$:tƒ{qh.߶mp_$OE"ޭHvuyW]*}V*Ni}})rT3 Jt9IOԉ :vZ[N5w`5vn3a?ɯ^"Έ  ,s頠+8QJ(B /lP騐ARLEEʟ01{-tUO\cgN/WqG'*lƘ`N):]sч)%?HݧE{M\ R[0S]Timw1W嫿:x}u^_k?g}va }F\mFZ;BZTJѾY8$O-@*)wЂ *,u&eHs_j}?QF* k5Z u{Gn\HZU?6QW\F@w1a%ʀuQn.j 惊KvEmFmzix= 5wב k~Cwس|IބAߖg:˖fMŝ/iBcV)پ}~En󝜛--l6 m/|Ϣ RSxBJ/) @B tbXxlاҷ=;(j43[}_L1߹Yb:ǘD}1jOb6*tbQ.$\.%iI8FҎY | \KPKȳ] aː+n#$Qur#c2tss{53!@tk2RrG3;r, RSJ4Pέs^cH`:z(ie3k.7j->MHFǤ"#WfJb,#   Xĭˏ傂{:tWF ~1+5 G(F( [C0GNgetKch]VMϥV#kd i$8 0hMS;.1"[Y rH2SrRWJN邲^Ÿ3oIp᎝t{r`%gb\qg<4b%b}J; ߟ&/̨CKs>E@_yDȣ}3'lNԳcH)D0kt|!ahlx41xuz5?@0X[i%gMHtH0\]+70!]w^yw$j0d0.>L4߻i&zt;f;^pG-yy{V0:J40Х'#^<'wN)9,Skκvu/a9׷|y~p:^Zo0i$Ie#f<?1hkh*кͧ'|q i}R鸥AuSZ2;?O~_uG.eiS?iVgUM++@6U;4eKy*а]R(@O?ݸKlE 6],|cA);M°yl(N"2-X|0H{7X 6緁@VtQXͶhPH8(@BYN3]X#483x\ֈM"{_ZKI8Z,C& 5S>*H @sQh"hLZPy֢@s!v8)j&Qhq@D  438+ 82.ێ sСty1q֛IJ2ߙeI{K3ŸaK](a:ГƟ'?b7&NRbܬ Ű~27FM_wLDn|sD)%`/={%%-8/jyk`y*BTEӅkO]R3DZm F9Ixi-NrAx%ǝ! 
\ ^J39jWZJ]i+vծڕjWJ]iڕVjWZJ]i+vծڕVjWZJ]i+vծN]ծڕVjWZJbh`Kڹ-p1մjWZJ]i+vծڕVjWZJ]iH BedfqMT].5Kkعgj[flTT}^z;&\۫>~4ߌWq| ȅN24s׀591'&MhS⪺^ glMFig'Swׁ=?~ikJx:k֔z.hCL$]TItOZt YRrۀG>}{_B>.).^d _?e+zyۋAnQ?e٫o%LYBRVOㅤ,{Y˝:}^ﶘ4m>&mo> .1ZV䐤^>mn T,p 77 iL դ:B/LQsV !Rw'*eT*^--f"ZS%V{/gу EthdEq.@N%eL*$ <#:j>1cJ*SÃb?<|ha:Sw:FKW[M8Wv!5Z8'S㣚'216ҫF0/mCRָШH:h5/yY35/p2#5Qy9M 1( !EbPDKA5/U6F)D]9C"eh)P5rL6/a^_a޵dK2׍2[m^f5U[[Kot7O-lO5]OWͻ m8jKtzoc!W6&4C ]^o-|6R n?n&Wwtf ׵[=mزtn}}O#6yp;ovwuf3o2Yk:6~s-FvK{= NjA)~1P|"3e6o.F7!5EMj/B *W߹HR gsSQBYx#%[NBckͬdCLUᩳ7I+L.{oo ue(pq'Qo_E:i;bkcKҺr]DY=>Wa c.AHBMhKM5jD8ӈ A1)ǝR @zCh@9@M]dâ%(/19=0"ALjM<(ÁDVHu CLሉbl1qoow/$Ww/S!>v 6 zxxCc??N,5[o4>GqRk鹧)&k$^yx&LPT"#fRi6r:Fq%NR^rJ<585Y;qH(:_~y>_Q=x {97IGȏG$ȭ:WZkz V͞\B}nEw1?f z%+]ldk Rlo.L̦Oڭ7ڭv7׭=sg{6=LqyG`eo:/UaM՛ԛq9o7y XA3gnevA*w|ojaS%)/8CtcekW|F3bV򬫆.ѫw'VGV XùpaZ27]J,;y)]'VKjiڭp.VuzixcPQPxEx.8Ä&L_8˔p~<c{ օ^ 2,/ 4mx`l43 %gHΰMLy=樔dL~سw̺z5,D Z_Z_&QTV/< YQzp8-qR`oCi" 8C$(eBr>(6: N+^ZFzǭTԲEzqygxR{.(sA8y.i|mۓO63|I"w񜝂fcI/iI#O!? ' !`8y XQ*NΗ,H]̶mat\`RxFLQSB{-*7F+B%)S{9c`IGjۼ%D|yp\zöHճ _Mәzԛ2=~p_SkgCmm%/5^g4O`c.7?#>ƽTq|~Oʅ9xJ* oɢCU̦:cQ} }gk\~7.Lm#ZEx! |9\tiӇ򶓵 c2ay33" eK%<9 |W!%Ÿ""u@D㢶 USV=}IT`"jhEOƠzbq6HyHgĽ3;ǯӆ 9ϞRgq'ym֩{@vq܉9A[Gq>p\mX$µƂ7GK,q5$O'=qP W2ԱL"ulH$DxN2P"5$NhlQ%)$Y .iMU`qׁ&f~S빱1$Pl@xFƞ}, Gr'_s"GJ;?):hd6}q6j? 
X?.X%5<(*4U{WJ7.xqu_?FG0%roMhlSpPmIBe%IN"PT z&ijbD{ruPHY"!(O'ctvEy+s9{ ZT0ɨDBؘ@D &в;@aHG%(-Gp7qFЉ"ڗޙ/%zW+K(:\axw.IۻP25_jw=v]m{jsF$EQg\zAm&)JI $r hAU6JjP#9\j 'zP!ǂEiSbTrKJ#c1q#j+$XXlf슅0Ҋ/?% dmLay??_{ף Gl᭕m(RL&^L1=AIH4=R<C,M9YM.$LP =Mb`"|L+]H8 a<.6;YEjN@X zM<ϑsN2S?"B$ [ AxGMQ8 %p) b>B F2#@'DL|p+Uz/Aߵjrۭwv:@ghv?+ZN.&|KŕMG/7wZh 9.]trjDY(mlەƋѰe*|o YWѠSa6pw؏/~k^ F}kZv ^[(2&15ڍ7$ьF4YDEV((/mz`rtLthnU t=Z<0n:Wfh *{5=oBzG7iY:Vh|0ΰAj1=4E"d~+ZGʈO{&CQ䴳FFC4R9XVq=)ZcKn' #g3?>F₺jk MܽWPqbV]4i^4sdy5K2%ZuK2__"O; <4/5<*-@ Lu>ɳ0"hai) <1T\9ug{لw8,=9Q@8F.tGz  lsJJMd3j:%* *7F+“8|K?cqzewBUjM`Akݜao2ƿAb<\>gOͰ٘icV[xc]C[~}OzSyXg'WK ;OK;[*-*^fz T7pv^vURWBMK<"⪪պ rQIu9Oy]z;OVޯ#&IέAPN cQ$%54hFpl,Y˵pAb"&ygLD#GU: {EtDf1[>r!}/v)d(Њx6.ڈ,7)R/yL*;y Cq^^cyY*IOtQ`ςD ziH^ y"<'J ('4 6(eD,B U`qׁ AsЕHeQlWψO9T:9!Ѫ:&HvDSMGy(I@(M>>Z2U>?lC?] &o\[wѡSl[yOt5);Yo#kyUfsr8׭"pi&F픻sD Dy &g F8reW}><;9ANsǹT|{ ZT0ɨDBؘ@D &вx 0bߒd|Is][y#8ŸZDpJK[KEٸV9p EC;C.oyuOQM4j3i.7憪{OOK%WE/*7ɥqjij$DN.@"'Re4hӐZ%*H۔Z)CZc")U1H*TM(JKblQAda,de!dޓ@.Mŷ%f6Ov?kP`t3O_Z&Xi ϧUJ:)ƴ'?LB%DD8 Ys,6T^3A%T6`d1tiI)r6Kl7{HbX6v`fJ@X D&W"3,#(L-[DPu?-l.'L1 7G{?3c l68nꄴ1qDYsj`OQfɥ/f+̤ٿ7.;)!${ 'Jr  r-NHߟ6IfN:i&Rbv~Cսo{%,̾u?iͷ߾^\.屩3Y*84?`r6ŃWW爌 zKOԴ͚g3#Y~Pjz\x=BḆ ~Ӡv>Ynfi.vh #] Zas|D6 q8e`żDO_k̭Q6j۳*bI29.>w}5Az"hx ńt*qMsY8s\?w2}O?~w? LYP[I&gpo=C0,547kfhS b\J>r˸'g[[+}YkW#n ';LԊteq= 0 l~1KޟkJw_ZhD6}h%^.*f`Nis[ᄟ91ڊ#@'F5^VcN.oJl'$g:U ` ID:+J 9%c"۠4F4%&$K(v^#o0TM }{6)BQfi9Vp wI✻ !UY`l˲ ƖEQ`|t[mKPf!!s0Dd_H3.<& "! 
g{HWi$iC8;g`7PĘ"5:eSRWuu_{6)זnrCmwPsgKŷ_~ Yy"Cv=@֬L ߙƃI 0@vmV 2^V$ Vcv;&^Sg>^`@u1xԏ-;kHE>װs1;M6oԞ9v*gR&+K((V;3Zd#xfK#ֹ9JIDۏSgЮ8\=y6VhqL NY`T9F$㑅H>HԔ1тFQ4`pH rSk4 ~Fh\͡zOs3^}bv?.:`ÖS9i[`1VYE 62(T'95rVsr7IMmG_]Ͳ(u2Y|XgXY4Ckσy}Z#L㔲K3S[nsDcp=,\-ѷ3=N퉝 "gOyxu[٣KPK3^l`&fлA8E'lM)z\yRz{vyń yio<+Tj BЬ;m૳=3V;f;PaI{tQR{v5ulimOW̪=0wH{l/jW?o fte/ -PmG+Rr-?i%[&]'6n2I|~3c+eR.*vvl-v6DDI'}TZ^zQ-xqg (G&EM1oKl_nߕ7M/^" &:<`RC9aE(h-5cުրg瘐"o'a:m{O6ڸ@=>u#籛Q#!i1;-MU{uBzݹSw_`as/{-X΄qa^8gnA$0̖93XV8dYO{#HJ$D"H*.4?r􋨦68_Oc6\2s 6^btui]5qܦ\G Pw u?0nqϙG6odG2;D!e!܎X1c2b=6MVHKDk +Qɵ/]}|'[R* HB9U 4?˽)׶-rVh*l :qN=k0ߎ&(),j3i \Sb6vCc[E8±+Ǩ0'##* < Yc0 BݠVd%MQ[ P#32x5sf"T9q,lme IX;,z&y,Y|™|UOO~!7K#6w6ΐZT@(/\Jfp5Ir;:"$QݣE#iHZK(IItY$R:"Yv8 塠vkPPtIKSoDW|xJ;eD\s K{}@zBŞ;ux☁ fE;$`  yG(`Jiڮ^9P4D &5K[㥈H[FD!bK5H ˘ !F ickB*)eV"jVD @7bKFbS, A9T&2"FjD|p^l"oY.4d1B}Uŀ4I}N~ n8 ωx '+%I{L>"LE. 7fsu>Ngp~yf*]jkWͨ .a |QP&( иbwGR'c\(\BڕCLV]5-wTh[9iv1PsgKŷ_[zVjzEխYl<{dfenؤ'"^}mgv8IY%z ^JnY~Ӫc֌МJsFmd,^(TwnǝdD-틖,LpXBh+k2ȃDD)RsUּiLγ|x(poڭ.]3)6.3"d`.IMN bRIWAyٻR`UjGEVmM;un^}煞x8\ruujݯ}ǞWP lᩋ̽+fu7_ۤNkN-|7d&@1R{P,֙dJTu/|ϼ.m)}5cX1w:T"zZ51 ap)1(x7&aJ;f2Rʃp-a[^jFmY0_beHx(Vp6vMsݼKDԴ{*Z`rOTOPj)vsiVK(9E$0=)iUi@$:\]6IN|sW HitL*" 3â _I%pDPAIcR}0ITdAYɾu-x&4M0 G(F( 8ñ1 `B8%E/r^vອiW[Z$i$8 0*&g@.1"[Y;/`^w.#߹wxߙp3:K C;9撾yJ$p/8^ߗh\2&&Uf/'IaK? 
f:%v :OVIcKRX"p/{<"SQ>fd$7v<F"r:BW6LݩLߞN7 ڊN[llrքτK j[U_uD>|v~|fAR>ld (f fUbzyեT%Ϧɩ[ Xc-6^qƗwTn{wVvOfF0Z)^ϦՅE b$sbn {WR}7 "XhwiQ SW 8oGi4C0ifq|6a0,0 f0bɇm=ѓ1{7钇#7JQlYAr>$dC_cn=z,Y)ΆI#]rSk˛eszkX.O?}D]^Ûװf`\*Hژ_7"`a=GCŻg -ƫ3n _c\B|qO*1Ph{n D7q)x}*NҡݞYV 8 |7foՙ+ *aBH/?Ӹ÷Pm} ,de$L3)n>yW8{yKT-g*6YcjIswȷ~stoo]XN4se_&uP+eR1EY"&z2l]ң%[Kԧ(@S(`5*Cz I/;J,} *U9*ru.>t|NmhuzeF.1;>ֳVkmX »̪/cۻVXlxWV <N6+|}[#l09Q!m$#IZck|$!鎿n n_HE<Q\B4fcMD3ֈTMf&։f2HC8r4Q8)aKl Aym(L8ZqsL(QmGs+:2,"1rbrX+B"zkOwg)sȯ5ry9gY 8|$9@b8օPhD#i`+K9 's, XYEH(򙖜eN)bxЄîWC0=Kc.,wG x"!w'f`Hu$jF;&"z{"7IUE0w_HzsLbQb??+ʰeCnMtO56sf1,(2&ˌG&k*0^lߟʓgn4&^ȉ9ϊX\퀐ہȭ[<*-~\ {_EMn;o6?'l#*xj;#4 OPRN(A;D %(gxSN)'Fh#j)#F^zrM܍p4Ydȍ$mPb (Xr6bfއ~kxs9UK[62syKB%Bfhީ+S)OEhž+LQGeJ)Fͨ)2}g2*Ov Us8ēnst/ϸ%`2M`Giv]KW)uٰkfo0C RB1H|"ć.fPف{$xR&EHܹS)h,r2cBJV#lŸi6띗^f~94n`sɑxJ5` \dj'h5T4|;cΧ̬1b/wrҎRrW/LTmF%:5D{P3-,` " CP#c{N25tsN9?|y)B R(3Ǧjn i.R!ХzT2 [V&p@XY N:k/sI-FZ2S;zt|+ݝ@MMyU{A +Qϥ [-i[t0S(zfmS5oS3eVdˌXhȺLh kEPP@ 8ñ1 `B1E/rjQtJӱ$iگ$!hʂ\b.E 8/`^7T/"mS=|H8߬3Fsܹg'ϿH<iw|al'@KdF]6b%ci]F }O/fwgYIn}ʃr~{)B}j-6L1 +Py 0E䬏>K0ugf8 ׶s6> ?@~bY@\:$UWS>ÂEn~H(tGeܧ3)컈k]V䤤\%XWrZ>UFN*mlbqdbx%I/&5G J?߶.UBӻVz3”xUyInj,=}Lg6ʶ&k4(6y J3eS,`3}fհXLjA7gT#V0#(6cFL36j.GәrCw2Ovݻ q!ծ7;@-Qto3fo_l=l3X04d) ߥh&k7m)c*I4˂\2IE9K%շ.C7/; >.g۰9wJ؛ݫn[9`_k׶jnoV>*}K37Mhrf}[09!m( sA iU,tm]C!;iHYGOsԈU߅Xu'"G <196"ՑD*&|)o&P ߗiwygЯC2rkͰW1cI7QfJiżBSSMAĉSY.Ԕ(2h{ƽ6π%ުW0bQG"`"RSFDD b FрG"4}5jgҁ5.z_I2Afl8e+4[܌.63-3Jbv^[MB`B) 1V:k@N:li:GbOt_+;yEh4BkM̨3 H%/*GT KIw@萩kg=/*SٓůruA^hG,%3^FuiL񒉿cGW_e'.edLf=,\h=ȭ;hS~BiVոN}ₜك/KPKg05Lt:ӫ,M)^=qw }tvuԼhnܿj*wyh'qK{/Ҧ)V"пz03oNoI AEwΫ [].7--G-hl5Dxg9a<^wK0A1 uxu~djHwYgLGzSUItPbum@h@mq Fٰ¨rDL [e&l~ I$ kin#GEΡZx$9>lLOe=ؘ1M)HjM!SEҪ)PDx?"t_xFG^Y{Ddix99ޕ.`mX^h>"] +&h>jz?Up2C5*e^YO7iE Oz݇ڟ(ޫI_0 b VK)ZUpS}wV0y9eg@3l~)B1TR cIK:>E}`l8;gLIZȌ%ผC兏^bAH}QY*v ] %M.*,`_f{ovNvtfiM0eHː#Gΐg^jPg !ɦ;kC%MHL堒L"ۍShAh(:'|jL&oՍΣ4joLO.}m8}W|JOo ԗ?#O4/T{_JSST\H<"}^rR!}ْZD(%LsEkb'oK[ JBajٌYIf싅1 (4RkxL3;8 YU>8˯>] wO&` `uvXeȎS' [V(M[{]IQiUaS[,D + Ef@[R."9sAfq.zȈBWL?,۵HZL>Bc'P<֑%lۇ !Eo$I{ѵ0c 
t*Y&QAT gZl&f<:( g9J?EDu="܊h `I|V*XI}qS"/|΁cggh[ndj}c&%ZbNjrJřH1y6[t) NBR1AcDl&o3ȸ8^^g3-Mc\{\\Z0bNfʵYZ*A.msfq.xh#@؆Ytܸxϻb4DbV }sT\}mfN5. ܍\OM 84jY3SJv]:!rЙlP[ lyH{?mDwe.? ?],#>ߚjq%^xQ} ^?'M(sz2kLQz6z6>9 0A^TdžԏHM( v74દdq]]f^AE+ ]=W}M-!ݝR}HF.l9i(g4YG$6]4P4y@!шB;Uv=|=iƀ(ྮtxWWWG^(O7~zm)w_[ugOY?WH*ۍszMz~R#&WYwώ7|R%r8Sd>cs[=;S=[4~Ag|s3) (vd֪0]>tтaxPgV{m+C[V]YD&cZino4k%Jb-FJ<*!hL"y (~""Yy%H[BBurb+qs{Hg[e=亢lUqtRklz#z܎LmdI:Wo\{vt"vpaRg܅Ku0ך -ϖDh=%:D){3JIlhXz(l-%m!S*[|O"q3F*MHC&QkLR"({?L͔% %rQ煡ϾQXPx@ID=.( R\J=*^OJ '뉢w(. rWqUչu\ +g*#vvתnzsEJ)ęUkw;j[/W,bઊĥUJiuWz ₼ZypUt=\G!֜*Ë*wW,J)UWxQ{WF8.WV8s9pUu Vi%r8\F#\)zz0o?yb͟7aBu՟?J*2u0 ]X(H6өLz|0 |.J.l&%e$kGVi%)BB!%AB*F胦8MNMer̥&TVFH mqZRڅQ!IJޖ$ {mAsqLq0ρzqEI>85Q,>?ђ㥰Ei Pͣ>oǒ@TKL4 J W߹S4-;ֿZs蹴?^ tLxPوP{-HR,6ɤ05EDdH6J%8Dg4Nljq?}8 8?̳\>~c2\;ϠbRvH\XAĒ!6_E!$FJĤ=Dm!}*nR:=gv;3=%BC$ҁT {@|KPSﴷr3l)%J9eF*Xj%(^ $  1)0/ZL4L(v ڪ+R jPHv*,= [됇"qNY3ql=3J)ۻY.;atu.ď=i 'z _&15ﭓ'o׿ILaD3X&Ld[f\Qgf&8v4m۾6 )ǀ61Ȑ5ѤjbK9Lbό$ѺW3q6[<'7%λ)>ŬKjE]vEd\6[l~Hy~^{Hj\_T2 2&\lQFH*gm]]c6B:v=bv^lR}ىY1|BuND\G1+Z̪,ƭ>MEE5* .-_Vt@p ) ߚYQ/Hf"mF#((rl=".ոN xIY`룲VSy&>lf䡆!*m" IJ`ʹDf V0P@U Cj]ʭ8[Teyh;;ׯƭI}l5OH4WKG3ΣOy*_iPEs8O"3 l@1 vFiViPdkt`0sXzOY~>+lv[т8T2IHQk.nAF|W|̓SaxHzVW#<5eM#ױrm<ɟiwE٘n2>LgИa@.Gh`v{w?;7]BFDW,RʌwÎFk&h>jz?en8w~8OOQ |[C 1)v֝dpT!Z_/3XJgL~D3^A!h6BU:0r.;_\=؄fݽO(?O9_,8Tj:ؾC C>j aBdtm'퐦=vӐV)>9HzK!U-/iU-xoL I Kʬ#Pf.)fF#z 佫*6I^jOv/o{G0.=1_ DWyx5qu~䅊ru6~L-2^|ϕqG#xV?{Vn~i 1|-n. t.>YJ~G˶$;%''@G"2ɉѨd1҆RԘAB0*'x;p` ta"l%x-G@|cTZҽJfc;ZfQ|?iA?Xa8RөT'=8`<|jn%0׷]q=9ka8rg`sS0]n0_F\U*A#`.dn#HnY26ԉdrFO2E,3uC5*V)K.H`* "y6g.U9W3 ڒ9%vrYXmgbQrҽ 7}==Ӝw^Ox.{#"tAH <(j^;%&ҧk2.hu5.lm.yR34V?7f:۱}x2U^(==u3FT؎Xyޡ/d٧h+/4A&l#VKQ$FgsF֢2OYAqdN2$0&gǜ3$%`ΜztRYd.BЮ uB=$xN3.hO%ƼW^#ΛB06tӋw8K.eWX>Jkףw^d"LiQ&e<8#Q*bL&\NP8- ®IhN4qt=-,2N:Ea4 cMc9: sC$4,j"8ޝ${*? 
~W ˝{ .oØq$K̋[ҨGwmٙI{SK'Dd:Yn%1qK@b3.8bg 6.㙟N">Mg$/pųV3@{+Gt%YȌrT+r'Za|H#e{8׿V>aoۻ7ʷdJ/rn5BXf6}р7MLN&6߯8/ܓwwPj^67Eyfu?}|up牽[b.8++ʮ]F|=k ox#uTH6t5Fp9V0QeI~9}Z.nperBQ>r]v͕T!cVNRV}z9)I<(7o\΍R!N5;qxuNw?O?w?;.컷~~S3J' INOZ[--39ł151#)9W 0N֖m/i,2}̹ 9ȏ -Bd_3袥NċԊKy0^b?-htK\2vN0\Z`h`]a lB K9REğ:/'~z^(G߶R cЀ*;+7->|E_"Ln$s &2< 6H.kL69kBPUP'UڞxNC1κw=+sMQ]Tj`joe )v:^ʛ67Ͽ:W 2m pАl&c7] dH70 `%Xe%뤕)S,DTRupcJ`0ē`΁d$J ѱLQL +BT#g7L"88f32OʞU.oټ_KT|^.;kIlSAi)ls3gDlQ"[ݢgg*̻@o_Ial 4MvoJCm~s$K-/kjr~-EˉZVD*E˭hyc:q 8k>&'jɠ)eh;8rS0)k/C`xzs»d>א`uӟp{ådYE=|㝓v=Jp0,5 nq!3³buڝjڔˍo~E)NjdX=<T;py{ ңvKSD2@caYӼh/{xdG h PfnC(&*EѐX"5V66`eڏLΓ zɒyٴm/]/NWȭMK66mNψ sxJECjοn歷wO9tfG~aؾmԲHwmy|xw σ=z^k}?L./|=2ncE;:N/ٹֽSjr˦vbsKb~~b(G͵Lf&MFQe,q F2 J:dǴ0*} ^ԑ{foY3g&aAG(*p54IW 5"IZVXذ@{٣`|'k@l& *=덬x0kprm"gLWnDl3K^A1j6BS$0Ѣr'6l·wȻ;<ϟ1Ni6\2YY#%j1dV) A 3ҌZ֫Q7}KWW.ExFMq_EwO#4Om u 7}ݲjݬ1땲j_ ZViz!JJCoZ4zlj vhcSƟGrK,nR>o ryؘ"vLVBA|gGuGGukQHFԙD$ic+SE$: v,!ͼ %MIL(MYz ȤB) o#DVy3r syF=?9[Yks~y k#oӯcm6nc0=V f4_ɼlyϚ`n;o@`BwڣhǪ^vȵ,BL ]QF *9#6Jod[#2,8<1c$,jANeihF.8Ab DߜɿyOz<Սzq}(ؗ˶^}~zJ3>A}sgѼE!v!.D;aD>vɂaGNK5B@f'9drBZGH-aB҆)h( +YA]qgk6ᕻ[2㣢KvCW_BQP`#ZOY S;Qҍ:;7:ټϬǼ.t|nx(RbR(=,iKA"ҐȽ$d-7.$F$ն)VLY}Jty(RSHL:DBQIUkLV"$+\TJ)gХup3rvCZhH'i/3); ((-P}^ xpNoѷhkq>MuuSg_[DAlsBtEHF'.ϾOjH)2faPɒ3)3A,o (:ADI(Jd7)mlL%9skv+rv *(a)4r('N}Uɲn2O4nrg~>Iub %^+&nS{iЇTϠ#3qR`(BȢ -BuBMLEd[F"gNS\31qP`) zò\Z죭AlFn}BfU1S+xFԍ55ZnE` `=)pIRD$_ 9sؘ$s6K6fPe(ITz&aȒNϙVttr|lXߊgWX/V5l%Ջ^Q/zqk_Ek $d/9T{kEhFŒD] EgJ^|8}،; *lGE3~ |y"'0>q 1e3 Bg~g_oɆJwY@R:Q/`r s>hHab@aba t($\MqES{R='0aCgLÍ嗅.C/wnٯJڢf;n2/}[uwwۯռlڮmy|}s~秧[[]{n{^!7s91tU3WKZ{꾎\=瘟*O3fۉ.Lrmm>\ ՛<`^ǃȕ¾nQ4x @?6xVܕȏ[pSҨT\;df q\l-]l>5,AȡIfowxj{Q^?/y-P#"2!6"]+'A )9 }}&:6HHhk-zҚB""Rd ANCh[U6#gI֡CqUm[{ˤ0LJ$wXyOyix̛LŪ;Ϯn~)^ OHBXMyW*hF"2ٛN(dl#rɤ6̺wBKF}v˒ߞrNғYRP/ʆvvLVI$VcS9D9dF&09 <9笊 "M,1r֌=嬗߀oaTWY%b Xgz%WnJH0$TERQ_HI%G~})F0@5B`BFR;Cb5I߆]ʦQls"VNɤE1םm5 VYَl0x1f \$Mǒ>cV1 /<(@[,DV򮈚:-A0m?~H)wԌE)| *4sLٰO pt9kHdF t"(GP<ذ #␡IwՍ5Q|Kr:ҍr=q16bպ};UiSw7(ѢDj~Uu edcJ3/VxŃ[]ϔe4}.y]@ɛZ"IBI'dE I +%`"R&>InD[tnS+㜌DRBx@'2$|ڐER&8 V4#g+>Y~G6# 
|0J^Ȱ%Ѱ)e N6&A&QQ(iW7$<]>պ U-f,).W%D(>˿wlD+ O f p`w7qǞqhJ5??=>ֽ]+dRʱ:a,ZC)ɨiښ;bdDA E,D,!v•yʖ*e&wb@AzSZTW|5ZU+\!]V xg*21@pCYe̞čY@t?Կ!7ͺYrjYM򧕄ԉ ϟOsbګT;y(NUh1c!A-,L3~z?ϟ~'Ӈ]sʼ1n$?=5&z8w<`jjjo5ԡS/|yj#>RkN IY{/8?MfVݿ\?zdxRAWUS<՟0,'D /mxV%{A<l TOs*vH7=vyC 1 3 KȔ@QFA C.S0:`Uv:9 :PziJ 0fcx 0[Q8iJ є")z/-G~?r}0NA7o$1;o>^Cȥ{p:1L֋0Ri61MICuc;jo p}O־ZOG_=f'y/?]px |@$-Z&icdf̵7{jr#@J^?{֍/ $gx+}vhZ,Pl`%GwxtlKlؒ}$u!9>36j WPP ۞KL1Ȥи Is?L"85>"劔jVpÌ~}6υ7Oi|yYGt5RToR0eD2mM4Eג@/W"'XyټS8MGSV/6Ǿ 4gZYo[C˯Į;whL $3NҸh"`Bm3W謪*B3VFᲿ,=8a)#Ϳ%"qm[q"LK;ph,q3Q 1`FE•\$7~%L{W *̊:+~Llt(b-}Ks1;cjW+:b o$R{ېj'k[?꯰ڗݛgݛ+߀Z>ʲ, za$9-'vZ8ho@m iYOȪ,HۂT" A6 c+ԌGRJ;Z\iNZ Дrуǚω4h}"'Ѩ5@֥Ȳ^gmmF2f(`{n1! 5Ziik\co~~ ǡ.٨* Om'BdOPz4b>䬄=1o]*K,D X-rϝ:{!MzͲ ̃~ZGd4襙^zQ5^)ysPSu3O4 c7w {Dqc*90NvJrPo:96vɚT2P WodNhU+ V^z[e!(Nf䳳*-ע{G8:ϘwYv"LM1[M2Ll!4-nuG&?*s5 8廤l6hQT>#V:j.F:巑mdFz IY!C{)jǭ0&5}%DhdNɥd༷1qYVg9rJ{ m sOdK 폇_HURf K/#CK8e}FU[W ^872{o%daz)4k+tlc"ג6T'׮]q_S3:$2o%`elZU<^(5pX2FHk36HV%⨹Αh,HO͖t^eF)D劍:{e@>oBhrDNiANMs]d.$Fie|[ۡAyEP|PBcbLAR2E}JSJ2$UٳCXlwnc;"T3>;i\+49l@]2Gl0t(.s"+b$kt9h'ȉ,[DFaIZ[*AFRv>k UaTz+n@pǜe1b@Hfx\f!E'!xT9my/z95瘄̐rp|ޢ2Ј BcˠQJRsP+{XՓ vlU*ge rH稴&rRƝV ]PhDpTx^ &Ǵ!nl p hAQe5w-NW鏴,9v1LU)IH =ڕuJHCBl<&$=j>dU 9,e+LVx:ϝiP9rxpdU8dƓpIA5^28wE}62ؤ\fѹRkR0C= IH@Q[eFMa<˜w_hŲxA#1*;$M$Upo5$҇-}92Bahu0,G=G:_u A[hFC7 g,o3VIh-!Fh :q0&evD"a"ihT k"=I4 <nmVܛuk}Th*%8މ B琣6g/ega=h?TF/b$3V0c*yYDq )5ڤP{]1ۍ'/g|.2%A)QB ,`RIjZ;I4zӟ ( Urۧr3'g7.ngu"ß3o -$?ʼn| }DIwWu0,5O`ޅҨlѹ o+̨ɟFٓTyBe[dJ<ԁbb3LMC8tQ:/1ԳQ,9$zr%RԁV\A9'7~ :r\(mibӤJE/sW,̾6}_-h|v~Q%-*G0l҈:}$ OtV|#$ T7?tnhں3i^̌kme]໋ob4a91gs{ݓ{= "bHۏ?^AV$ʞP.lU7be7KYd cjE3Xf~lG}vϺ5UUc%UHy5f;t,}>.F$rXď! &aw楊x˺N$v˯??~(??saᆪS/4J˕$Ie+o~@תY`pmN W{gpn9Vha~CQ_fנOt>3I+Uu?v+D6mIBm脽%,"x Xem|[3:ZiuW(a &mrI<*ZCl1M!K:+ZE-373ێ j`6" \\h . 
䦸U&z +;\pcf|aTH"*EbGv6M32vWh:[|N[Rʈ$MFSQI]I!`* c->˶Q51NEKl4r+(A(ˌm%@dRh\ äȹ&Lv rEeU8$Qa Y)pf>;Ϟ:H!JtRToR0pY&(ZJk7o#w^hjԊ`&طZaPӕR[5km<~#2M+llRҤUnQnGo2KAXHo=&9E h~\Vܯ5΁1>lb >i$b&B Qp%1 _ vÆA/Jn/B):{>uJ%uԹ1+1Nik)H߽m\\WXl M׳Mas[oUc{Kf-eYnEOް9-'vZ8ho@m ihV* R{3 wPMXe#JFn5Tmi4ڎFsi*W#%l"4\!%Ahs" *ZHaI4j).u),Y[Y! X[cL{MVAn9[HBW7$K(6u5EFU Vxjו8Vy>y| As9v\&xr%0HةB[/uAI>wn"Tt食`e+mdtۑ ?~_f4 Ix<ԅc\)=+}9Q|;uL}P[rZK$|rgvMtA9|33ѳZ/=e3=LeIЙKd/WY$Q!XY- 'd3krƽó'qj/^EVYfa$UonVLzG0SyXѭϧeoxr W<\!$]֕1`,LĎ^@HN-Jae?{׶I\E'\5ałAr%JҌ;U֭.-dRYQɈDZ֣tBB(-=fEg4([dAQS6 Ϩ#B fX;Ø~'rώF<xnݡ;//>/J}?N/dHRX;fI+DM:BUuq))۶WD b!DICL9+>)_(J2NUh+9=QxvvvQiVܜ{7)sC\#W,]GgWíU9oG d-0 J 08](IǑ{#'bq䏗`ƃyboY۽c ٧eᐰĨ,GׁE(VGגLB r)IXzPM?Q{'Ǹ|ub5nsT}xyuMuW_,} <}y<3.UҩE"(gmFlPƶu./q 1*hV39 !YDeq!P*Z*MT8*=C߬e՜I}[_(WGlRcE|vX~YQ;iȞ=+ ъNDQ 6%cQ:A<%11ސ2LٝYJΣc%pYUrIC#;^G6SzL'uK'8|ex@$1PƠZ(SZ }M 9G:~z7ޥk|vWт8T2ͨ"E#cn lėZxu堇,}wAb .92$56,s($$H RX70 mNcdZTW }7i9f<=W|ަ7ܘ9O{6a@.`r5zy|57]Mՠ=N<>]iYsO-1y7]^&KTp7|Њp,?pϝdF__>_ޖ_$n^xv$/jmAHC|($hW5a9r%מޮ&Mx5ء gڍ#F3 ΅\0eWgovH]6@('QGg LhTfs) $g?3 BcՄnᚺ{R-zXӫ^+]`?,:Ϯz)s4kP lLC"LVB{מ ܄B$Bd:֖FC!H%bEq(%F w D>y=JI!IOJf`2&mhQե1mf.y#.Oi>_WT1}.>:9 ZV%0uW$^ 5k~vB"~zCBhyJ5`>z93 CQFa|g歹ÉJoAW8g'f)Gfb1d>]4PtF. i`pƽפ??-5fÝSQ6^޶۪;S`ջ4`0!`i0袅vϪ!Da$`RugyoAzo\BZGZQPxX+^b Gaj޽uoxesvx+w&zn9]{7[rMeݹ_Xc|WZ#*/,ʒsn];;z%vt~NOlS%&B DFT(mó({)J^DIf)lZ4^䘉j[)V`!K8:,TnR> M=qrWŗɓ/ J  z|畨X u=;fWsx+]O1[]®uOoO.b^Lx2.|8}E-`ͷUJQtvxN='n@mURҘUi!49YyNCI%oKb6i"Ϧ0e Yi)" @RspEe,ul89;jow>;')X5O'GJe-cC.ɞb0`r. 
L$xZĜS)O]Et6"PB$>G@[=*sc{?w?{\A[Hֵy]U'R1QU̅oԓ?@Q K%C X@"IdCz٫zRFc&*!OT;)!"e:\9hVR:NKQ4818o3 Rm0*QYíN Mg4Z6h>ZF PiKv2d:-ZKhʥ;t1dz2O|Ϭ7 ]7]'VO|u[6X']^O}:^QQb[Wϭ˺q;G:]ݬ:Ė%,jݽm{U;Rb;r;=L--nO\ynWo踹Zv/qUo)]b殶}ycgl~(^o?}Hl[nmHH0ZmH*d4^ `Q] Fk$ag==j{Y\R%8JU&ApɀJ9@IO-;/%mu{ Ƹ$=Lf[s)-n+v@g=@wPKiu.ReV/$fT22W8m v iMQ@BcHAic ׄHt4qau:hW%ݵܖAzp=er@Y^2<;u;Hh$ALKAB6y3C0Kn{y3~#N1 !xdҜ_x)P'$b3%No8T@0^Dˡ%eҜ}@ ~3Z ُ=5(HP`+xḂe3abbr$dYFT]_+p@N䘠-ohٓRN$. Vz;6Bzsgp3TDgA&y^hjHT⽠VK$AY 8"GG0OwqΨs-8TJr׳Rf} >Ng588*϶6G~Sbv:Lk.;^AIep2A/D"dZzA ^0<`А 3b,4*;9&cF3EL3+c__#>IGs=_jd*ف-nlQWlxV ;e!%C1FЋVKM@`u|d'lQIKCȀJD<($+2^/y!?62W_k8A}cuqbGUwO4y/#)ꟸ_$*$cCP b|=b0xfd:^" E,g7R娂GaR .p1h$xrpx[+!h\a:p9i:>+ټa lhE sϫΠ(8SѰ5e9 +M]o#GvWcnCnb YY2uH.cƋҰI(qfhT:wD $!8W6cTߑ|_~kd4a쵛vœi.􆂏bTDuQY IJb1كH>}=yK417:NԤV}޻{ ڿxe=EmAAh{҅^wb#_t#*fqMw'gBiCA6 +"(!?ᔢˊ.ٯ8 2ߐŷ\0~@HO,A/qgo/<ǶNs^/LF%O?8h }~ҟQmg ^RWdZ>X,u/KD7'7K9:xxRJ}T+mwX#w$:ܧdk#4sm6jSdD[_M WfM 2T fk u-ߵ\ݐ,oV+1&ZoGg[fH+qF ; eBfR&TLm]2i (O̢(t&$Qv}ōabY(9#5战Re/LWMo=bƤyߪN'5E`e?ujV%\ք0W&x_~_-n[SnBf`(4WD3BMVqBGxс`d^K): )kCGs(c$f`K 3x%(4E)ZEaX:;nQ|eu3 XA%!\"Q&mQ4u2S<|ebjMfݏX# @j j¼IxxAN!+Svʭj\11A9y>ŕ2 69$]-Y)ًzf=UƷIm"!m}ߪ;d20H^z['&y d۹fpW@2U!M.ѹ>)n@/IUbiZZN *QG$~NJx4$hDc7zݾrieK86RC5f-+T7x_V9^欪?>|o~L'VGE{3ݻ#T9Y?ΏƓjtsX'=fơ|׻,3 T3_]/(D!ۡ2hOP ,Z/NچZ˛Tq$ਪ3ڃqWB");AOx?^k} 'Yy$ ؤ8w*DI9z.d$I)jiD6$]p 9n %u4qZ筍D^AdTS9o| b!EtDvDX)(;8w1eu}z7Tm)?Vv+0iX9 #҇GGM>ڍ]`^D^DkVak^ G|g5IU ^PܢR >QϞ0h&yP'^e Na.樓N8'R ED0e=kpR)\B!UZ2vXFƾ$ $i6#e>l"tnB _Ė#6)S$Sl4xfNM8{ QQv7$2sxΏɲJ%Q3uBnGMLXHsӠM݈R.tW(HbX6;Iݘ4" ¢a?kps4OVj10SHf ڢ0a\FR Ȣ_y #p9ra$ՠ+,Ug`(PTs0!`i*B|,,UgDjrqѧ=bd_( Eq'i3Y Ik&XO,Dy$pPp< L^!bXna[]9u5?<nҚQ4;_QYiMqF-7}\>K;攞 t Z(ppWK+*RT'{C;ꓽ<@6*g7=L-g] =Jb <Ľ7^USȭ 9E#2+N򅘥ږ8]2{Goo<., j&Iep*^X fy[,;2@{G1^:dD\0XT D42 qC 40S2CMPK@Q1xΒj}鐩SwH; rJd|Nfg\8;jKw5h^lraI+Q4*ц8@=K#|3j*]2Di z! 
N_ l<G#|m N QIKCȀJD<($+*y OO?{h<WF/F$Iks:OH$\ qH 2G ͡dh;RSN/!)}ŀ?JL ,RTڲb_*YBuOU,*jH4>Z, "SxZ$f$CqFk a0!b~>;T&-&뗳MvC)+BaEL)}Wu9ϳWԩ(q ?wC|W://.|y:pv ][=ElN4[\`k&@ۀ?GurMj&iّ\R_e!{cv&C5!-gT*^eXZǪE]@z5yk}wzUK!A9dJDE Wپ_]GcSͪz|Hns;.6DHz"9,ټV ݽ tlovtz^NCw30aLѯ>Pfw5vY`~ݝ"'ъr?=z))W|!ss@*O]G=͵/cT1T;X9fշn8|wM ۜy6J)ТIܔ:aΆL'7 tl>CW)z#vzsXll~kjl[,%\}Z0Ixn-~}Vɛu|חNv%99NޭŏW~r4dowZ>U P^~Ebr|+{aqպcgtc/ix'=j;O\[n%U}z;;@ sY|vw/籿UUgȡ EOΛP^;vISP5eE0k9CiLnFHi}6I~e\;΅⛋/Iyvuo݇_f}X{FPy[OB=` #w`D5[2N9@6MpQDKb'2a' f -O'<- n78}kMZOdmP<#떃~7%mNx}___{rvҒ|`x? ]GU#Z8,>?__~˿eC?/U3>CUs~^EAyE/%oD"'#*H#![€;m<8v(CD0SZP`H(n8;x BJd ?C#wm:x`Aͥgbzjv{ =.'Ey,>jysB/Ѩ;>wz:ʘej5qp=0\C#X|6NOq-2)>c{{8[ݲ!2]g#{% Q/B;*)k!Ζ%RreEQUu @D E);ROx>_k} '| M4IrnrU$㸈Z$-8$)A3c)W8gZ5  4q hd^|2 Y] b !"Jm8ՙ 3^@־><=ʒe:9+#RgG2엸a2(5[`u& ;鉃cVODsQ")l9QZ@Y M 1J2$KddXu (%BМznl $ 24?WψO9?LIEx*}쯘8)C:$JDM}yOXӓi8OmC7Ia[l)*V+0xmq5 M %KT\.Qu(<1YdqIqBxT@RSɳC(2y?sCE>*D;MMHG)sD Dy &g F8a1qvt8ه"gOXL]7H8Jqp"cS 9:z} BH HZvQ2($E? ?h>C%Q 5SNѾv)q+j^ -ESzo%Kqq.7xJho%crzt`[O5fbE?ǥqj<(JI $r hAU*V9G+:$NBOkd8,JXR6!HFblGr\b!+/h6D23(`?lӝq[xQ+9=JI2zix52Ř/P{ i(ylY$M<'pa6T^3A%]Lb`"c"^һَn6 k&桠vq(͎}do& ,hC^#s9^[a¡!%A0wÄ2Xﴠ2' hE{b >KQ$ǘBʉШxXLx &0k/"ˆȏxDč[HdBH% :2f%Q۔ !꤄>hT :RH(FS 6FR©IRQq5D%UYgu"##T* !99~W6Iy+jG#u oPev%QZeNc.J,+_Lf JJxϒG˜Ƃ)jѮ%^e"O!¬2AO[ƬZ$$LhCێ grTGZ#DK?jݣ|U(ڈu}_ w*+fxL)HYJ`O bJI#}<*W,HD|:ν.,;>{ b.! awС s"kq" DC T"K6Jy|ge>km8[YW_Մ::DP*X*9ˬc6ŀ:4&Id2[): D2i;h>> -֜c2C*Djd4"$c*!"82hԝ !oOD@ABb$9*qʵ.Np"8&iz~v0m.!Zh[ Z^v1rur3\ZkP`TY` E+%˭GZ:nk"Bc;FL>HdGC.- oc0(hA&2izCI `Q Plc9- 94r7h|&UݤZ?z꺴)e9< kD& wO/ڒJXvT(#*+0sg:ƷאF]8dS!Dͤ)Ǥs*sA 6dRH#ܥEɠГ`ru>r1Ş ӧFC蝹ZWVjY*J[TKQئuxM(UˉӟtV/+(U wK3͚Z~=yU?|~upv`4a`N0'g+ vk"/EO^w92ԓ=ӦnDc7nI-aLQ-h*|4\,zڟnW6dSMs%UHeOR:D>?ơ|B+%)?,6TM1{Y8{ Do_W?|{̅9~_/߾N LSqC^X#Z;t[t ڶ9oѯmNkù]Qzd]?OI(֣\YTA0+4lbZ[.7azT?_+QNc-B !`@5r }d]G%'A&{4zBa^a&ǝtȣbȭU:$+FsD kdC@ӓa!0dab;pWR"\ Fd@sO@n'1\h"9m2awʽAFu۝oMʤ sc#4ܸ)Oo8oǕ!q_L*-R$v4D+vMopV֧zhs! 
I'*99BUAƘ-[\u-#8}<LZ |nKJ؜@pkg_KޙL✖ xk )i#:F,(ZJ똷杣t>%F}F;Ќo; eľ;d$3NҴh#NBm3W謪*B3Nr:se 0&O> iw{ߌ5lYQfם"LK;ph,q3QOHFL0d"JcvɆAdE]F\ߧrAeOWH>Oޔڕ1֣^ik)H߽jl10}6Xlo6Lv7ȕm\z F6=XɲrR?q˕e%}gv Z;h/Ami5q8t-wr[3 wPMXe#JFn5T{*W[;zXi*W#%l"4\!%Ahs" *ZHaIj).u),Y[Y! X[cL{M 4ܶ5n>lw\q4cT%Vk۸>v%oZ(yf"U)ࣃ1gF1Oʜ}rIYVB@QXYV&y/;xェmҨCxo8c2E2&I],QEH+kuRRR2 oS9#&.F3YpVzU /[Ζ3p:6B ͍/ed5TMp9{/aD4VeF\, x~)Gg"ݶ8(|Hnu frDàC.QuEլr'tdW~!e5_iyy)1H-޼r3=G [<j64?"<4-/.2Ȭo@YoZbFo58LCuOJYN?ߏ~>EE_:tsMS:MZ8G_eu']}(eE(GXl! D/.oBe92iD'$.jlKiڥ\;TRTf h.lPpYib`Io(=s-3`s&!J?s: 1ɜgh1A}6m[Z]Ox̢u)SKAgmwHP<#ERc$%(bt9ȎY, z4fN"($%Dti 쨙,X%F : +Um;A59 "$Jg(M0@9ArTVCĀt|6-JziH"+%Vd\kG0ܕ,n[ ݺp[nO }ǷMD4R=V}!/0Иaa?iRkn5K߶krRFq\ 48ͪ0+(Gɇ6C[_-Hh=ro&q͟:8?07G1}gz>Nf :x7G8շՕ[UQr%ohm럞oupM6kA&e/c\6Y#,gYi9t^UoAEW:iq'AZL.9mJe].A݇NcRutچ aߖ+sB#\a`[:8\e%#Rr灜E8ٝnkٿ0Dzci|I'9@Lpo3>} ^6>7#R?Jz|7\i{vf/]Fk7i:K%Š1 84}'f&'?̍AmfU";=UJv&f\gzHm/l=2Rn#PbT_*-mzA1ImnZɽ@i?[:wlU"zvZqʱ=Mnd:[UJoLmV] C_]ywBIsb  ^{sFIq%n6I?,^%UҜjkKd){G"NtS47;5H|Ad5 saY0 nT~W7ʟ^yR޲ܠ[]Wr2}4oh pN\N(|{f}Xzů/jy}[<$y8]\V(A6[UߒÉѪ8-%h˂d1G%}I.I_u?UILYkWS**Z (6ϽKNɖ bdG[BysZ;l;zC9}{B֌%)m]3BJᨨf*LF@^S%*D 9X $3ؔ LsSVDG2eQ}h9(+a-H>L Y!)!d*ae4]!4)hR!TTr<d>C~gO*ciͻCsŘW{! QBa.PFL>ho廳!Fc-[PCB'TɐƳ-l_s\ֆ gUNȪs#QDE+Qx[4„fo ~z4 AJNjqu4~1J!uCZqji fNc}4$ԳŬs֞Dqˌ9$%UVش+!K̍Qb&E ˤ0J!B Q}=V \.%ɆPa*I%CLxe@l T9%[0nEE D'+sV~6ԀӈEAh7kx Qw!VM< Ƣ1ID6r lNS"G:k5 SKȆjH,|]V0ڊ0i~cXa\9膥DbQ}ƕ]b-@P]-]&:OEQ ilTK*TzB`T` bX@vΤC@8NVj T|Mt&#%ZM=nXYgCRP6Z'8B6# 9'KQqϊJ2"/k.6`-id^>ѯT%@H/%i $D4P(46jeb $z aΡ ~,|:omBi 30}rnmfRY椁:c(QBu"whtD5! 
fEgf2Ϯbސ7o\p %#GޅJМ6T-d `Z:pԥ: >o;D* ]45 Z.Uc-EOpE A-)JI VWed䓢Z0ӅicGցSA^|D%H.:L_R˨չO<om Ut`unV$ 8U٫;#*):`%g/$Sƪ`~8n;y's8-`I%d__b=^EDB.` Q @r!PDP-fx  BL)ڽXRr.1qo|0bF &qH426҉l\j%?(ce_!uf1c(X#i%Qr[6jRJPz+t42߬t (j/Vp5] Ɛ)`6Mm+9ӡ~\I6|Q75vfM^4*qfQ:Zjѣй1c00MoGA1jS11~5 UY-7؊DRe 4IUU;I ZeDy[0 :-FjJ xT'#!Ssausj?{.ڈr!1uTM#~Ҁc B$"d D 9k듵O(:E'gj>?o)T__77SAj+v'Y(L%PX2'Y=65fc)kԪh\.C,2ceVMv=EƇHjӥ K0q%Gd Lwt#BR!wTu)Xo$r^}#v~7_ndm:͐ɂMS9>l~@۰'RVدl] ǽ<Ѵo>PUm?l@DZQ\mF k|J Xi%+E%b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J Ǥ^}4J~Z\~jSI}Dӳnd%yI┫Ey~uqIΝ؝sT=SoپB*<@lwGEUu _,1hxw9/Sno(ȜzdUiQ {&Ot~Hy?*H aG=~igO}Oq4mӃg4 .YdM t4*ȞT4r*>ir;]c\@9s]l15e^lK*ыwuD~˛_Z,n^X?. W־??z[_{;~7.oFqkxcc@N>^//26yw*˚}[J?4ט߽9>>,՛-}g]$vߗQ᪔mfZ+f*lϙ3BQ?vF!8ygI޵ϐ ]bwob0S1y#kh|b@|v<}&.E'37W/o]f{4g}c s,M>reyb_pxy;i󭟧nrs;7(Ƶ>8w'LLE:k)tRtq|7뺵:|V4Ե :B*Z6&ynֽ_śŶYh"5(RA7F~`7?y0E_.fCQp;Ɨ[~{lm5[Pne4Ͽ\Y?/G *gf :cwe39cײ ,袶EK$XHK 4ecIm'shDj,RaUͺ7ǣf\͜WP[ ˭#^/wc׸>WP᠞KKeKSѻ{sW'hs-uثO"O i 1Ϟbc o/c!t I  8Ѭo%' /DQ*Y uMsL-.UH5+joɿ"ܗMs|$!O[7U˞߯jYde=m`VS"XUGet[T/[CgzYi@Kb~==?n4|:ޭ^fRU5\z~kl>=-*k[Z?;ɝS9.2vfEh7 .O SG8Z],G9)ZݬG5vރx]#w|:otG(FGh}4?D=k{ׁw7d<lX#WCw_ . _[ϟ>C67-6&N^h|k_5S6p5!/)x-85`.1ɋȓ,8@:=*#h&? ɟc2;Bb?fSK h(/\i :HR̈́1UMX+t$:n݊rǠ bU/D~/D]q?di8|x͕ϔ̒ _w'91aLȏ^yP>o,iHI 7BQx-\P4.9Ⓩ@@ :{msW)o{>B=9۫R“&js99DBIV̒I(rg",qfJ{Qi)r52΁|VտU:=E9x-!%THZ๊(3}/rܔʘ6 YMz+_  RDfB_}AE#SR4[ιDuBW\FR{_D^V;h2Z|xwE: I^BR*f(DR*|>)V.2X/{ъ(rDј-YhVٖnv/gbV5$x|1mjTn,DS2F#.[qLDPq!<"<,+<8mjx! rp :/\cJ;*iQuTyEɘEb[de& t;)'eVoQv8܄}x1bR$r* y<iqsq>kK(1 ɆGc .5+,8^H=(@i p4V jL .bsp"UJK"Ts&J(><$y 2m+8NZ (Cd(<\P`Hmkms`̍4풋~ӳak]k9J( hj-M MQA}0ԑl@PFyMR9 ~~~>w^]Ue}fU*vW//2)9hϙ"jMd9=rBhk1VQ~F;n qꉷ2~ٿR5:F;\ZR0 B21)#4a"lr1^gա :.I^99wKt@u`[ߥ&˘08 eUχMZJ8+VNEd|R;;.Dj'{.Pf`d0I$:wѹcBjljևy#Mą bVE8peKI*dC&x-3pZSo:rNBcHAic: DJ A*n:e>٬̢T9c>Kgng\*8 ()$N΃M^r fA;h>fnG2Ͻ! 
ȜEH{Zn* I8L$x) z/{ 4!l:_t%5Z8 \ 1=;^1G2k^X<[wtIv͊|F% @-.26Bk3K58N rݭCCX[qkC.=q*|3Tx:Ϯf3AD3!KxI2kb (A}fD8Dȴ|7s `x>y'ό2_hӨh'/c2xZ*PD4T҇UL$1\3CMP >CsT9*x XZZ s@" KuHwZ7\: *ف#vތ>- v,jrհU7.xXҊ#;e!#%pJ F3/C/zEO/yޚ E_v>2y^"ذ[% 0W<.LF9}2:[Y\f-I.<~Ɂ\-, Z7vn';G Ik+ ^z؀.Wv^oG w_v@e6<џdbhs+HT,+E&Gm3F$j9S]]գ*':# R,uvkB)_ q1#WVDTu @D颶 @8eCTI./+1=l VܣIs`FPN cQ$%54hFplzBOLu|I<G%33:L}fi Ñ^cy YJ𧓞88# OԧDA"ODi!%2'4 6(eD,B U`qׁ^!scc IEI@{%xZĜS?SR6xFfH.  Qgxc?G@Aݿ @Cx҆z*b2\\ Fr$o{+SB`ĸEseS]D6|4E&w'G$0-5ܪ@qȅQ#Z&O#Vg.-.bGEbh(EҼuPe  Dy &g F8^VolCA}H!;B+ YsͽNK*dT"@,6&"( _6e1QTI2>9~4 2O}D1.֨L9RD}y֧})Rn0UU͡/'{h>]^5QyDWȍՓЊYr\T^-}\zAmF( &2ȉE` 5^%*rhaq!qPDc")U1Ȍr:k-{#gd& {]e!Yң,|#V1EFo*o~ݠz8Ϳp-<ꏸ+91JI2QEix52Řxg. Я7B8 YٳxN]Ry NBPP#$&>&9%Šy(R78͎R` 9O@X uMYgu"##T* !99~oL,5+jG#u oܤo#%cxڞxJ*oUO|^#/2`UJ/U i6.亟PT%ktɬ'[) Qc֎|̀Ki!h<*/Ǵ/bP ܘ]_6鿙qz0H0]M }U)жd^m*Ot04L~65i 8[2Nrz! *&Nlt6G4Mk3a(O̠(N> w::s?Z3XnMǸ*Ɗ*_/=Q0dzW̯_yζZ%T6S#;Vo:Puj+B]f 凿%+_ժrQ^]ھJQYVtEݮ#|(wb~96~Jf _;9l6m6p$VG WT1CX7=C9T:h!m\g`DTpmQdc^J>V(EH9th.Km8l ଶq. ւ24CB꽑]PY1_J!.J&הJ46 2IqF]*TڰR4|Cei3Ƕ>+vd=*:4S0^".AJ֢VŮPeaQ1SY\Jm"YV8(rK,DQ"'okS;#Zc}α[ls me/k\c\>Zx|Mʰ|9VjaʋV^!)BNWF OL3eb"seNE}ܧ$MY=}F%׃eU[go$nWrzl~Oۥg>L6JPhTx@ȝA/N Hҋ>)JʻX yesl7kB΄o WwM}=7ގף+cJr=owPeW7ږ;%\ޓ>Afơ#|i3uws}",kVP\ [GcglrS)4d 0^ M$raקnS#WPc~7i#ObӐg4j_xV<V&W#EB>K>No#Wck1-q?rW/ߋZNv]G%4? h,?ZqߕYm|ʒiLj+E[nޚVGm-O /yk{_aFJol@ۉMo֜ib-_{q+Βr4T{;,Y.b;SYqϮS-hP\(Hn|x`; K:xy2w 3r}<{2:Sљh`s&tTj9*ɉqIIjKa^* fސppmvi 6-^ԟ_ˈr0 }K.XɒC NN!ݠ3QgPYNPK*r}(B:8e!sNl \| FY󁳜s&T ].l*N J˾q/Ӵ3 Oj}/8cُ#T]>8ƥJPJDD l$$kY+!nKHRbiLI܌3 / w5PI&yg-*(A`}׎:LB1BrrS}(6Dʇw q&;:pP\ty Jjr}JpR*y%Rxψ)քer%<j6:A1BNdh}e2/84yI kJI㬐90^Ff(Oll/7;IyUycٻ6,W~0;Nv`fLQOkԈAjR$-%jb| vf8mC Z9+(%)fUzǪ1Hr puľܻ1a\Wnx;"Up\@ /&ZX4v )ԭ"vԺdJ)mp0`-ĩ搦 lgdFN 噴UAŒ5P|UhrJ\̵ й5Z*ڈʤ7)@!`j,Ei\")zg4bDkxJUgd x*WXhXk./ lhBA˃٠ 4 9bv(@)"Pa. 
W8nvPL@:9.g !./5GȳBmTzr%Ōq2M!̃V;@C>@B e "t,:\"(, 蠱&X50M|DbP*c*ʦ[ciT66% ac5Ne ǭ`I p2 @PUv 45c@(qHseԸAcj V:Y$o/Eye̯(_u-$( VHݳ+Qen0ui) $!)byIٴƚQ$"/=kW(@H̯*+9JOQq@X*+T햡 /\5$Vި댉p0 d3WIXv3~2Y"+пzR|"8BN?]ia;6,tU|eM=B&q`<"; /`:p<(}rdpҫhR0 Rn dPgP j6V[/5~a=(9@ib_e+ JPaZ*57=dhEYR:[ģ&@܊`mmGw VgGESU_~(Ӭ(al2JRpac0v?c^s@ QMހwK }Y]!jUW@ @6 R{p!UBGm` @;k1-PxW3K:vU 9GŌ:%a ߶5* ΃r:PDM \{j#:& !ܨ0̹+ 1xXe(! c=H>K4К5''tX:0M`QIZ#Ѐ7T*[fy/zz֭Zv2l-Ia. 6It*2 c%X Xiw7v9NiWޭ$U]u+k3 m @][LqtU  G6T0 0ۿsFŪcԺ`ZCYkpy(Y5ؾ{'?#,`ë=7fncA5(QR(.*PP<*,GTi'O H@voܪȰn$>^uEr\O.A"Nz/XV0 .T %p #EÍmoUiFlep?{ ڰ(RVXGϊSZ]FʄȍL #RcZ |,$oX@Z2)I1OH2ˇgH:?=(Ws΋v5ʾawU@`HacX:dС 5!Wў iDF9|%q<{ۍ9mFِCL*V.u/cdiVM6yLA`$9D%. &J#Ժ5\״.RC@Tv"#DoJ`j?®z|7݆f'աLN:b%eO_L.w lC0__ҿ{['lS~Q휟^oC%lǐ@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *$nGQ5s;%PVǮjV*z'g@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJW $☔@z~4J?z%;>K%jR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)>_%o@Q\kF k|J &I 9* =@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H )nlݫm5mח 7S)yw旳usŞ"8Ȼr9=kI @Na]]3)ef>˓p2+Nڡ_Oa{Ry^ Ӵ8,*u,]FRy`@G̛37Xˡ:_Ɨ#@ߌ~ѮpF,1_,^Ok ߩ ~N ١6Dko:@ڵ8B(eLFy^հWu霎Jlӽ̯f3NJTuˋPT\P(Sjh@0UVyYN.& `Xmb%EɊaAk,-ǭCv>[S":%YժK 01߶p__dgő"Er ߖgWk5.[=,ՕR<8aFrՀ:WÜH|!%CR:-ưV?lϓnF&?Oogr_W'.՟?Z舿vZݢ]%^뛧=/8%|m= a*mmlmKǎZRMTyr%?Kn]oK|"vFs|v\}צONJR >d`KǨb3;/D&xsDU8$-fIZdO߸@lV`.&ESMg=˻A ]Iœ3=?|S#ZU^j϶`i_h2ϷOǒѿN iś9}K[g;a3 M <ɛ{bSBy/:`oO)_>9vv"ɏeNmׂmX[n Ǘ';3إᏚ>4ƥV9p`LcVv.&G~Ǖg=Go͌k:.^E[G|Ȥ5,~m8Ň0uqz}x/Vt_޽x5NFTgeոr _M{3?d\zJON]ϫa+`ӯ=9JFp_lӏ*+w15dLa l{nҺl$g{8an}tkZ#=Lª!y^uYo&O&?ڟo~3l5&,>~6í8vjTlΛ}-҃+ [eč뤥.^͑ZilLxv@(\.j;׶^g/~j{&?s_&~Cd1NΦ*^|Q:X6͌ggg6;O?]\fcF3v}?-9V؟F;o๘v󧧧_ZGH%ϪQe^1tv<KS^uٶޜ)Ƕ dam.bHM$]Wt XXE. 
w ^?vh pGO]ߑ727!lqz~1}6v8l'54CM]\ZywbTBJW=v|(pwg{+"Ko_+ДS"ܧDыRq"0LSKk⵰dHMpǦ$L5:bf|+|=תz/}یOqO0O8\ju|MbMln-H:O6"Ճ{9VuWGBk,V;ѱ;#N<#Hy -VY_ojM>w͞GDh3!hп0`Ҿ8U52ei[WbE輎tdě?oc@U|M|yu g{Ͼ* o3Ai}^E·vL訊LUE^}KB_r+)F>Zd Q2BSb S=k(b(K)O&Fa ǡ);D0e0KFL sbS(7 ߶> PmLpqiZ[oxforq1e;O<:Q̲C`yYT*2J⸉3W%i>8g~|) <̣'0E.EXG^*EI*<r:]Y/O.>Y:n-3?),OFsU[0LOzyqw~ 1[@} _kzv'c}%l-n4C-Rani=C+CnO6O#!ι]ABGAFӣ}l+?1>/?wwӽ{)Bhxu!3TJh^NFi*01W&aat.(Z+N&ݖ>y.`c@Q!hIQF6f&hj;{ߤ(IOEe6]E1zÃF}杺,+&~!ƞaf=-W<:=ټ<%ppVbV-9fbe^b-΃+0 \fs{홯?_JfQ6虱)3 ƸK$L$$`E҃!00 Q+J%Jp`̧mˋjs7ƝMnM0IdbzK)Tą{8jg(`I7VTxŶb{1l  T}Uoe>Vf_GFQ yY;zi);nZ}i^)~y'Z (N[\9;<1E@no wk;]~~g_n.h1:R@2dɤ',D`^Jo.IƬ뀺Ɨ4H@u@ Ugbg;ur]vή滮dsY͓5["Kɾh}Nz0¯\_ͮ#nv'~uly>vݎdxw~Oc'SSe6d\ʙ)'K  "9TrN"\uDFdciDё)ɦ3.1qLqB:0 Zd/g"_?wY!RQFKu1֎#;70 dhyxgvgwbmysF໳_/i]`Y66GWgߚvrt]Oˋ?tټp&˯5IӦ#^4|QwJGEoGYZ (^zpLQ%bOw2=q]K?94a2.ҷ~]Sx:Rw>ސt%,B}x[^sJkঊ j@'u4OM ^&s:MbC)Q' 2;ns:#8N1_^r4(Xi Ӡi_}O}ȗEQ\.G|wN]34p-);1Ə/ʤ:]a|}ɲ:?ƱkrE'>(Hz1+Rw{pkF^,:)GZ/,^]*/aE?IJW˞q|yD[ \wIiC1"J,_V _~l)Uh!Ρ{q/`(Gw #+9)rl&%qt@?O `kcrl+,;mna5OKu>bhMO݌dC@dI[Sh=/zf^α6X]C$8Zb-GBT- u)ꄤ '992nF(YK #&HU r8,'F*Bc9 0R2yYy ҅u5ˤK+cC6ȹ lLXD65{ 1y 7>8Uʝgf'X9ȴں*7&H I#2G26#ÔynȮ=[)نY`[K)>?*PK)(kk([vkz5QQΒIb"XE$~ l?$Ι+2Ypk%PZåy{-rz-O*34sjW+Zq0Ms4ɽ_׽44^M2M\qKD"_dݰH7-'zkp,8tBu}\Ι֐Yw A7,VY={ zo^ -ZeDR\IW[0 a?4{ S2]"~JJF1I]Rz⒡I%ThUJB֚;Yq$ H Kk9 B9d,e/84"lur*pP`Cj<=G~L)ې8/}H#5a2[kz`j+JSyx@Go3GGyac0 i-+[,ѲL,!E|)Hd4$2_ Ed*8ӥeY Q9EN43doӚo+BJtJ]^T,6fhdW'HdvDkq ^{PP-7BA/FG0_iX}ώ: vDo@FKB̅V&Õ&u򹬦|.B[>a'N@ I8dLՖkYH9 5}JTUYXf('Ңq"\}IT&]V#gU>w*SWgWtT)5rJ.I $.DhT@=5fɥ9[!td2"l%}V|JE!!R0X{S9u~A/^Zj|{O@:Ϯ/udQ`mD'}0rk턾>9jd5qUp'uLօmY"[026ԉdrFO:E,3u㽴5*V\!)K.H`*S.ڜL`,LJfFvXTӅ8cW](*B~Ѕ^$馼 5ˆ'hƣn5d?)`^;'KL`* #/IzLX{aebyMEmJm!d1@rv}Q;L'i];Ek> ؍E#20 IY$RD)y^_I9eSb*!I]U}UB&ā(ѐGG-$OpĒ.r)LF5+jlׇQL$ZjDYY#ʃFhN }Ί6bLY,y y,UՈr6&P"p \LH!Z@6FZp%FFvxunYt:qɮzQU֋pЋd9dYA"/]`:xfQS,r Q9^| /w>1LTؖzC ~ܺ8:mN2QOp;x?#5T_Jh8uNw}ۅg5KIόEˠ#1B搓>;9;_} 1mf%L`@!%}P˞(Ut[t: JޅT"h ٽt-rbUr12D+5 RLBZ! 
ʄ̠ O,\d1Ȥ"}p~qH|Iz" WJl=.Z!M,մE9p+EŽ)[_U<^!陌##P5 qLEV%KwjH򽓕9O^~{b3t}RzS*e)AҙFW W2*Xm8~"BQ?AO1g9PY+sB<$ъ1@( U4`FR=}bqwJM4FT:% %6JA^8-tQGbo!j.nnu#븹*#IP~0s x8NĘ">$+[=!)#ȡDX YUzGMiSXi/Qœ2#̋( ٤ť2k(`gMGa&5ِnE2@%b:cuHe%YX:G  9nŶs0_wڢ}㎯:ՙ^8_DrH88< ֝M*FGl%DNQ4>g!eZ Xlxx|(ူz9$▇背Q#-06xQqHs.*Q0IYc*AYJy>%dKȷppKCY}0ΏjVf Σ %z/(TacePARjI:5(;#@LKP̮G1tDAzqJ:zخ2/)D@4:&E#aQi$(a   X_]c/X~,*U~/>d fmẗ́)= q­!$c+#S 21ZNVhzѽxQ&G82*UQD(v$DTf sT ox ⁑H@u N ,YdiJ9%읈 - (rd"7̻n]HݭAv֐!2~pfݞ/ 4SVI6NuG!QY}gkܧRbиq>?'%bx?d(bQSH3N1`@2 ts.a:f<wO@ ^vBOgտJ`Vtb&,\:X3\M|f9$ cq:_ ѧp M!Jf+k"qQNN*G޸㚼Vgk1(1"-TA7Lff4{;+_O50S|ctr1d1̥^\Y+d B2O.Lڞ_e]7Im7ˤЇQiI>n/^v'Lp\+AZM׺*$q  9KMO b,GAWzOqAhTM|(Ǚ/gN.ۏgo޽?DoV` !DDGt-5Z9ق7藐6\=I mOւ(=~~~ocRTŭGlYeqH 3_} l~Yh͚(FQ %"D ]cKP͓Hťz6x29ѓf <`a/Z T{#QyIV99F59IOmpDFT&=w}%̕iFs au1'|QdHi:5\PR}L*En"06̔WI@o Bmyg%fSjb`_Yc:~3+Qlgo L]sއ%\) 3$5i#NZ(ڜ-rm|̓5;lܰZd\2;Ȧqq%!ͿuZoѪu0aˊwyy5ߨ6`Z1F: lp6HF$@SJxX@U=) Sʊ:*]lR۴ oC " k8|( ,mXB w2붾ܛOubƾa["{(6_ <Ȳ.rzsZ2Fl- iiJli,{3 ;lhNup9 Әs5eaR88X]q= n%h/eS,$-H8HhJoA*sNpJ$1 0@ZQ1G EέRȰ, Ƹ#aREbض5ᬷTnYBS].Ќkڌ6kr~ǡe^ >MBLAcUYgFr%BE=>xYp憽*gV/eh˨eA2@+kSqr4p*c8Q,U-C0!B !x0k5f,`k1hFs+Rtki>+(k=w[#V1%{=Be")i~Z@0`؅=r^Y|d|8өj:'Ys%9ƑEP&\VX=Oi6[o堟-Hll!(Se .,Z(6;XS@W!L0(!V6 elW_7>1-ѭFQ ?fˑK-vFટ/)ipe&qtrIyQ6öPeM"V捻ft3OMZ5 7hp 5)oƓpt}Wnz!6KKZl:6Nev|hpөO ߅\LZe#sxz;/~yXxMw֧p5gڼه^ˋ7D/L0 hzppbMh[Ʌ,ܸ0㜓f5զ5u~)-dr"L:$ಭ` |vg )ngQ݃;QI (^֐٨犺C/!0:t `o4"QEsNyF'בS-ɸÂ1FLp,CCz{qpnC EQ)59fƉhEb2RBUn3!2pvZcȉB8FSVE#FyR;mTm[[cuOETcK\nk$b3a ~ |<̘7Y@-Ri4842{ R7i$`B &@pI4s5uZ$T{hPyoh~n[Oip9L(CnR(7HJvv?g]9NnRG7_` X烨K*Ljd<)V2""&ZH0<)c"ҏu&R,iPZ2ThqFhֻ@l2~}gy-۠c e & ¼&BP5;LjnDh:vF6x$y)hW: 3)i=0ϣP8"?P;\*w S * w@'Q`T"$P2j\q|c~8JXR~iG<y,![nu 'i+ʗnFr3:( hB>M\{J8/4&.j޳zwo\TYQ$K$٧L*_0`$%/p-Jm3eQV,܇##x>9. 
y2 >*X\Kh$Lft_0 8&s$ 0R PXbR~d'uJ&ٷ~Zo dtK6+X>L> +įKz[b>U+NmĨ4=,V1+::rim.hyrȵ!,Ҩbw-@UwNB C@Ia;DB(kB.0f:-S鄡vI7{6ۡm\-5lA-rҚ^QxC/չ7#pEͣ&I>Dk}k'{]viKM|>VR|+lauN#/"jȫ#gƚ!mDZaE1$1:;3z;eMb dks+d |hRo&K:xvN^t̏u^ Ny=5~_*T85^6:ފĚ B*mY`;qCw{~5-MdJ=ә}k͉c҉zXRahFRnKhj%W⿹uSv}l-nKF6׽@t{ z[G58؆4GMh28+QRHܚUnF{m,[I3hפڽXKN`=L&qhճPT lB&IRhđ ;Ōʃt<혉&h@)iGڳ?u, ђ0U4"iޅQϴ{p+Y0')PBA4{26}oۙ~(L0@M4\g~؋;_2lfbM|O5UzymxidX<(RaT" *ޔx$_pBmR4(h{K/Z~T.\ҽx ֪|~q4+9/ Q\_AwR,[v͸4\ 0z^md6,5=rܴomv`agwGɍ46Y&oɍܪl.3I[6`yUVW##X,FȘ{JQc3Fⵧ#T@O{ - ~- 3"D+3(j RH3Z3&*yȹ7⬯Mꋆ*b됦cfѯWB_/TX,ofr5Gs)>@$ Ky;t{JS=|<ϤG#Eߋu@/25͖@($9P(IqFMFm8҃RNKzsO;')h[8_vkOZ#_zoD_4/X :$}@!ǜ!pI J42A"c"l@! 9_B򜥁!@sγ` `{$;yJ/*Dtγjف5OOQ:ܾ`K=^9b^> A@AƲg{"B ɢ %*+ > x<4DO9HM$b|d'l#Yjsa3w3I~:.a<-xeZ/_=a- j&R=qW[^) &v=`YS3Cil{fӎ\shƳ#,8c ^E`8Ihs KT`ZG- ?e#jlg+4Ѕ߀y~/t|@_ E 3%!0Q]A@[,UzD 8p%7+Ч_CƍCn!Ϥ91<)ą{/kz5=^ӷ,}<#J>VQ}D]\bh%_R(K+e9ըHA=ܣGs{fKE-s:}6Eg!$X JP"? 9&lx#)*DeppB('OCټjky^]ٮ «mG/ ,d!Um7L Zo^}_x>5x9i,A+%/ċMJBܳ ؕE%xd9o پG3G2tN[&cdC-[0vc Ěthը{7X ?Mw7F-{Ws{@Q^XTV;?Z wg9X|qƶʊU!Vٲ4f :*f gd?էxy\ҩ_]63M}̀K!t6-Ot>)j/Թ2;|U}{eà(oLaACG }끐c-EƘj"'ԴÖ:nv8!L(Hk~Й }3f{V~[ܹݟ+j˃Uӻnzr@sM0!md`e4N]LDRX'5!Úh .F &88 p. ւ4K7FΎWww}9J9"V ;UPb%sd6i.ջU2/_[`uwZ%VDH<5yJu|0o>" F'P+aWv` _T>Q7y x6+e,P/59JZ Q/!1D$%rc{c TL^a|ů׾˱˱>oɶ\^jSjWcݤdWIԧͣr>9?OK#A䅴IN!) #-#$21 :m;]MIUіw҃.1]zԄCM֠J'}@X*~Fu<"`%TInшj{QC/nzӽHIͿo꛹mi:2f7&O׭`1V4`܏(oK}Bۥo}g"7oJjC>g,5JMd 2ʩCu:jK/P4T{q+#9u @&ьKn *S hMN`(rg&=g@XSWΟj|/=t}Ml>}5Q\X&MVTBXn*m)W^`,$j9!$DIU )E2z'0D.NU$[*cB┡7]ܵD/mn$a5>iFHr}+<45hDYcz4Yͮ*XJ<4)ULm),eS1Ca%Z36#~4Ӆ8c[]B9'fy~Qo}x#ӫiW߾]_)['Y4`ZcԖb\G+)K ar4AA^jMU& F*maN8_zB6kd]ӈVܯdWL]͸cW`WPr-s섣Z۲1H"rvSN2ȶ1+![TꙘcrH@.%Z`%8ںT+r׈zqͅk|͸d[Exe ~DјV"L ,&d1@ . 
z)fܱ+p,'Pax5 tz?cn ݕdfa}ApC3E?Z~Gww-t`MPu v6[Q+`rs2hgՍWYD{ں1OW<|@Έ1H%'[x@.Ɗ5ex׭dl1c.!u Êa/n=o}^Rm8g*Z3gzw hAEOğQm9Zvk}Q+Q ~_ߘ|VZklVߩƇ> ޺;INXiݷw1_*ժ\ɣ;?F}{^.8>wfCʼn淽ky?o>ڰWFg[\3olXpVK\[ .$pRzPa6Պ=E~AyD3ohg*5c-e[[vtđTYXRIq9XFiSdJZlEA(x,n6l!&_FΦxHYƇ;@,` 6q&`Eg7mBE(!wH޾q 3ؤ n@iFߓVߊ'F-_DJtbf`)!Zɖ7R2H8<%_ע-*51vJ[+T3Z^8|VL&,fe^a^RU~ٓ5o:rHDp2DK-ȚM '@uL){EwJ>DV[1O._LkIEel+kHY$Nپ8,βA4&%wNPy4k,Y&ǻ%깅ul4jWMR y;1ɂNȄUK*f_J{7jcmMMXs"8s"St2bLDUj( 7yo Hm|~hϠk/cTOB22P ziVyg*1@ڧJ ōDI\Ÿ5O)$h%G8Ϝ |b\3/Gw}h*tFKF,sO :4$}=zaqY<f89֮Ѹ'-#;{QS:W{sV'{twgd*Gor1Pm,k+^t0P}mY{o/IȔGa4wj^WWo]_??9јSW^dOmQKovF^^4>;R=ߠ}3Dѣ^_{7gW/|}qvWb,c%1'ᷣq9>:|;[Y+ aFxQ0^.1B!ͤڙP>ɺiiTkYe,Ɠ^GrbspJ˵*q\7geSӚ-J kfNRŔWUOܓ,-53qS<͎Ƴˣtt?W|mT_^ku+pN7$GGD'LmZM-L[>|7WfGJA{;k/@oF?^8s4M`?-{&[GtUu=Σb~[#oSZܫb;Ш>F%d'Nl$L¢ASAkTqXVؔ줏pX8Yv<0ׂKJW#!>]u^:Muv*O|&ۜ|a:=t عV ;j2tw:si]wW+|Ql~u2_ ` &Wj}=|yG80?|?e}+]?^|͋~/~:%VW 灯k9لJ5kUHC*Zz4z f2$BxX4F[y׭_˳mH :^Ix =OP`ȵMJkBжQQ]~5Vp\$OgC!lɄBr-=&_)J9"JBsK9A"ErUغ I5ȹ3NPEh4AI$ιTT)k׌q^;ۺIy;P8iՃz/}F3! 3AL]#tGk&}BYj*a"`Z( ah˘P,H1{H0&𯞇vlw}5c]KYLɮmꏻ=381g4X Qv]NR;?;=. 
tL1vEX[]rNM^ap^YO7® !uCHݽ!uסc*;blz&HejԾx7]t2w^I ܁ C .L) aOOZpLumpSzbH䷫bOɯAT*B6%TuA* ] d)v (t: `E~CibPZuTyv`~GgZȑ"ܗámU|Xego$L%${z%ɲӲe$vT]UUO_ߺdɻM䐅BUT:I%<8:D+B4B2} ͚䅳st;7lM7aɄ܄iǯ?vah2d,AdÖ5T$%;Q@(ZL'k `Ni KKJeښ(rD/B8M΃ftaUߥ.$2,ykεoy1*⃂ߜ,\7|1 sHV(YK 4nLKnwoïF{9;"Yz WC)8 ٥HT먩(KT1HЩQX<כth2V+LJh :c"K~бuFΞrv|J`T|,q  ],D ά3KH+G4(3\`EK|-T{niĬt\/z4N - hN!Nmo$u5`Gw0JqqɤVVy YB(՝m5B=4G'㨺t+vpMΓBNj@*$S %DV}<"L[VK]u"> :qGSmE+A 1i6הu255n6%RI+AV$db;5kxX396pv7!M;Yy Z QPV }xQK6غm.4.`AT bc&IE\*Qtm}ߪYHY(!sH u.m)^xS<0Eu}`w{ZȖ[ @'CYr2z )QT{`&' aQ&GACqQl?QWR-{gkl4IzYXBu"?Y jI,RV޻{VZpNb&I2Q d{C{Az =' +h0Faz fwg ƈ3)3@o)j~{jk?5ok_C扲Jb[AW.F5>`k-1%O3'0M>\ΆՕRX%OwSFm"*Q=?3x"-#:Qtf|ϾOoV?9 VI>ʐx!)MEq \ 2Lh^ɗҼ_Οǁ2 `m}T$dJ_oyVys:?^QW$3'\}ӃiқM/W|-$O?\Aoоxbxu[7_W~j/p1up˘cMy2O$h/mOa^ڳ<= 7Τ֙Po|4j4J\gw,GVG U;xtuг9SZnUݳr۬۞M!k׳jg^IHX3qR+No aZשּׂ|a8[^jdxXAWU#+L6yk-דEKoS;*IԪq0`1j#Jnc촑-Nv~߹QX4u9ȠJ2Ʀ5xT*,hU*ZM NzjJaƉYQXC_و"'B,5rrBj$sdrS_:cɹQv$1;C׀ y ʍ37mƝƐeuuK-a#6\5f+*9 o'»DoAvt4jK!zeQ{K@7 an}L )e;댏y|!Sh6Kn 6i|T% %~L7%FFZvHI}ä{&f-~9;I7i=@#:GZ4bdU_H=?_wN2@>'(&C4JRǜIlɭꈒ[UWɭJw0XVIє e+SJF"`Yu!D:b/(MbuR"(s.+t댜qrKN}_l k/55W}\F3p"OfҼ >zLQͲ/G&y x5 jbAl,YQ+H𲇗/[=xHZ!%%0t-߉*H"TE%s^kL8E6&C}l!grE'}QBb-y[$!xMN1쌜Z# Ya_(E&L Qɖj3ud7=Y]v]$S7B F8j.ZtWXZx{OTt9 GlVt)ZnN-T78?@9F F k,K^$"61/oMTl~m}3q<eY$bUS9ا& ֩|(]b| ¼~0P.,^D1_L߰Bsx ْ1\QU;Щr\' #F癆tiqCi b2uמY|(h&[}lh>ڦ_mI k0.Xlv5<dzjq7\͖r_g|^S[YrK_ׄ뒮JXKNfc6޾jug6TŶEG_G7ѻo9sK\<͒[s߼jZ_|/ :L[O#Gr͚_ 7sK<57<\Ӭ=Z;~L6{BPw鿋i L_,M RT(}PFK\/rz,-EdEYԚV&*%\ûZj,E,dmBAp E]0.Rz@ev\F$B(w7댜= ҩn؁8&dEFX6 (jKl'Gf,Cg@o 6v"GU3x./}Uag2B!Fexm:1DMq6]lbu 1Z;H7HZ {+ $)9B` oaI֬Aey<K)^Qh6 ;o֦}ƨu̵BcHx-*cYdTC :\Fy Y'1) 9t:3r* }#k)%UR;+hd(^PchA "{a9LvzBs>0yH Y k feL(JaΊHWC66b2ΏFSn{w=cCg("tmN trV:p]2U o569t ^G=["^nW bUA3EA[R=rK^@SfUȼ^;b&("=x:Lj7CLS袍7Eip+vP@e^<D=XT\ı^Җ%[&{_ߺ-w1 m! 
1@ūtJxpuV 3id(uǛ5 g#j]7aɄ8x[/9@M']#[3\i^m7ijU}1#-K)=jS2#Qjr2fKٸHe-O9D+);d:\q $^9˂ ɒMJ:_5dצؔN%8yKt1x C .{͖e,.)ǖVd P$B9KF 世)^|Uî-.-KeO0uxbຽ_>/H^dR(9I#dG#Eߓّ>ZUH<{QkнeĤ9΄+!(ÖD,eKg@f:ZrXvAՌagᲒa6w8'E~Dpơ/pr9I'V KA1C*eq@|&-?{ƎAQ IWn{m@7AF?[3#[-Y)GI&@lyf!gx>~<<>G`A pD} 0<0hHBd3b,4x1gYKBL3+cQ` >P1@CV 5A)y1;Gb% zcŪC wFsWz ;ƛsqb^x65ɢlirC{fu ÒV$h4, qvG"J F^H4 гzސќ Fi'-q#*!^A%y`^y&0)<[?~|gWFoF$Iks:OH$\ ƱK h:2Ya"9jptY9&osjfw%C&CUHQin*,ėJx6E^{DE j5B$LIj-&$yK9ׂt%%`a|M^5l0O]pF:P pmJ.Dq|m1McN }bnNUtIO6ok^7%i^Ui9?Ɠ wgozt'e<_/G_b%Y6,͗gز(/o/c_Pq-l($m&x5,Ejr7vjgaTQCqqrrޢU2UWk=_ZkB=6?pAP=YST&)QpÇl/G֟Ʀ“U5mn1s9#}>td[]3 ?Vby7 tvtc} :=7?u7SW_{}yeCȝzۀEOڭ=fwSCզ\_iD&gI({?'E;8s< QJяc?l|K⫍MKۜ~>_ R:IT^q89Nm8)b69^-Awz*NW>pKW`놢\WW=^vFAk0&f}yln+Jgzjw_^}˚+QbNcwΦr3Mƣ.񆅳gjCrx4' /I.V3DXRQK5̽Bo,VXB f9n)ILR& 107ZS`<`F4qǭT Wri\i mNOV!xb:wM߼ 8J:ýa=p{%;X$b8phjL5W5xmk#$Y22Ùb+MK󠢩 G4uY d]sʑ{DX'ZH''.4@M~`5OWIolVC}$&z|h-lQ픪Njь/,rh2x _\0R'11et$|=L;G|lيU#7mN:~a`yPj_:ѡKUnV eT|:edME WU Ƹh*=Txrx\],?7ڝ_d%+GqyzXn:B:A_^=6Ig_?}ѭjN/XݫX;![)JpV&wkGX,%`?rgfe.r .6/¡ \ydJ2BFDmA|F`EK*);㪭nHFǵ@sdqMD`wDR&).^EJX_曯P

xWV^۲02,}1lJL_v3ULj; ӋdVkaq(KHnA KH!pNL09W+ر]M"m V!C%2;_DN Hu?O:ӆjgEEl몃;JTp0w#>x@-2b|^B++"Xk"j/*N@B>-^k=)-+#&IέAPN cQ$%54hFpkM'6DB-ME,O9c$ F8BdiWxXuvM2:M>D)۲3I 3_vGuG/Y92(KLAq`G_\M4s3qZXcOK,q$O'=q02?D)z6J(-dDfpB`CR/LA$Rp !$ Y|+J4 [ύ$!\bJQ9?SRJo+V)]:&JD.OrK+|%#@bGݯچn6L773 STV`L j*ϽρDɑkʣUDzoϺ'9A k.池vq,ě  !hcyΩVA@D ;ja:Xﴠ2' E{b K$ǘBXTGema<,VuWI30MbRc_D8 V-qi$2!^(mJuRx4Q* YO$ED)R #)$( X8*Xk[E睏XuGīӈ!.6+yc,KEQax&%~&$ #&艶h"S(C ,vEJ$'\<. ǂ{n8=頵c䣳2kP=JF<7D?~G(T>JpODx[7ex,&P;pI!9;*Q C "]rTZΤ>Ygu"##T* !99~WVT$|o@7hdqq,EPK VH &ХŊsr~z{n0mf>s3Lv9Z^?'0u3կW|;k"9[#*v暠rV$6ɚ#=2"i44dض| "qYΎDdKy"kDD5WFFjrDLb),%$C4W fP\{` Dv\6 ,h5T"M&ŪƝ^,矮rIVl'ǯ,KOo9I, ͒:8 sm6R/L_N#/^|B}(wnO3p׋գ}o:܊?߸kgp /&ȶї . j4"r2£4<] $i۞\Om0/@R;֮=,]h[\P2=\'<&{U\v+ˢ :⺝³Nq]zzCTɲ yHR4 ń!,R(ueل翀lD#5g;ۭ;oV5pʌign]Ri9(P,ժuQHcf3de* W0VǘDhI{eXzȄl43X0 Yu2}#cJ8_S i`+/"Kh2-}mm7 ]mI%$IdDA+\a4q.6iĨ2ZW5hщR[x8Ǹ @3 ruqRd %ZZ`#<%} ڸOHNOMʆb7MmT%nYUC Z9pxYޱj @z`<g oCnL`" T u+)fe /)vTLbgV UQc%Gji"jJ()46 Ja5V6Xg Q]\AtBÁf=0S7)ہfuZtkH繨)8<{,8z;f c\`ʋo>*TcQD[9DKh wzBH>"Kkp'r5cl;zG-IA>^LW؊"D W}\N.BNa|;?$ QV2Zة ,J1<*QGb'Um;Eg=kuc @2=BFmX)+ѳdD).XZ#eȍL 3Qc |,&o:DY,JLvLGjJҸh '5#Ϗ;jyѦgO}(ѫ`QiƲvǔp(Jf:z ںO#8%0`*~l765fC1I`U.uXv)&d,`VM6=E‡PD` ЅKq9, .]lwI3q1yUvիm~u^gq6ylIuHӡS!Σ2OSo7tACl쌨S; _1t,h%B{yԮUzOA=TԏI ڪ(LJ@F@ߑ{p@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H UYZ;F UX@=Q%Pғ{TY6:@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *p~qN{|rr1\Stƶn7;׋xÔT7vֆ:vL֒BnY6S;+﷮$(@쯳ɿCZ/:_W?L^Lvi٪`expNxr0ϝ(#>\y_%O]Յir|ch|>Ylz9yD>p#=~a›li˭X]XC!!whU@K@guSdnx=뽂I#@gBsηmx7ypYS>ס9:K9`8䇏[ ~ vdϖ痾n>/l 6trௗC?##ocw-m@[c4֭ od&i7{qz9t ޴.yVGZT5hJgc'\8 GWB9r17Ziq{Kd N}orMz#`U; t6x:f%TېfA{7Ηf1ntktyPoaRn ƹt;3P PEyq\O^[ ,R'n%kR\$RJ`:p "5 '4PE(#g2Jp6rKVl2z;>[jWI=uB{MX&b񪫶|;ӵ=6XtwTzn OVqt/c<1B 8/FEcSN!hh_E1bP,]ӽ{DK#ұ @9bZ;a-􂆪TVD#BVŐޞ!9Ce=' @Hw[ҴSր~QMn5v7"(-v Ou ȇ W<$*ɀ-yFEL` -4 KœGLfkQ=y4Ll*9&'eeA.'IpݸqCMq _Ys|; 8 DMH&!P`ix b P4\mKɋOևТGf9CCz(mٍ~bg /PIJOa:_xsbf7,5}h Ggoq(v!79' RzOfM K}ZWRc  !} }7[t?Lk_ƃ}o8Wf+^-1}:|gGAU$a=ULsLE[6 pJ‍P`G+^kcy2!$ 
RS-y!\"'^qpTD6w5Gs:,c8:/k!4b!C!DExn#zTwEQ5.qhҢPxswRR&_)d6/Q.WϿv?~RN9b7߼y֛ozś_$J޼)޼_> +xQ+*o,x]&|8m㑃y`6p=lޘ)|kxܛf>#\=5. ݣCfu`\s8 pC2`sFJ˲ 0t]Q:Ю|orעbL܇M+Kޝ]x>jMB\RՃ&3դ*f0Ƹb*T0Q}u_A3 }7QMjem-BX{aӢ]ƾ;N- 6 [ƒ)\S>ׁpcz uٕ-vOD*4?]Tz;J"o=V]5^$Jw~%cV[k} )Y_M&>k*Tn|?и0J"~_ _~*U(E뇑ؽx䮰Jb+W M"Z*-`oo2UקOZ Z=]xvƧ]$i>>\ FUe ӎ^.ˈD('Mi |0jveIwXR5,3nF΁rԓK &p80%ER H!ǘ޹Q()ToF927Kop(<(*ofTpɱ#2\*밊a,%VvUG6 70,|GUemyN2W.cFoT-vqH2b#3GB (F\ddl)@3eMͻސ=C[Dlhu4bO+_|Bv][Mv:=dۺ z!:m \+0"O@/LLj9+j$^åTn;Yb>~i}},ƾ6IZ|qwbҢyk/Ѡ(~&}Nj߫oՇjrW2&ڷ&rKz=hU {(X|6LNO9#פO<2 s2Sy< EXz3҃ ;|@WzX NV=~PreQjWr~5qRJ/mPq~ARL.%E.P/Y\K(9(HoRS{Epk l<[cR;ĬW逰 IVk/嘋Jx->c|\>RHgxQIK)sao܈%{^~S]/,NFyb4c0+ٸ 59-`-AƒEiC]n<~$ O 'DqGT\xDY7>!TDϣ [&"h$3+lVHD9a6ȹMXGS1rbrX+B"se#g?$z wH;EA}ڗ {/JЇ*,PrUޟKl|'qdԧJlo% <9o=QMKMz=ž.&nVp,xm1"Va$F^GUAx(,`@'hgb ɼ^BRe-JB))&zSNqSo0DPR5c6rk|X%.BY. 8 sgX{ZF ,NwBi(.ޏ6crq5 wLY[f9ӵ(12(J&Bk*0^\O(/;b E&h0YsX"Bk.-’99 v>3R 5QQl=N;[B7zfOY0 :9z pjLmc<f$~.GoԻ܎?GsJFL b.x*xѝl`*/`̕?{WFp_IV-n.;{}b EںؒGx~[oe+qH)d*ˇ>YQB1 ojچ&(ad-hD@ఁ,C{ٽh4ƈفcdӥf.Uw1h ?TcZ߈n*PYsuۉCP1Tި*Y:$Cg 9;bHl-9n.ޏkŘu|2yXo:'}P]DJr箘-FTYJPRBj=r1oP*#*f^U~_cZ^Fb^YE.|2ϤZZ2R RX&#+hl㲆E~*t򒞩IJ8˨IJSby/Z^De':`ptJ{_\R*CM EEJ[irtdUnFc2'2Յ ]Bx%>hq1KL};FA,CRGϗqa;ޮAT^{6QbNR@.Z c@9gd%W"YԼu?8 8wm8xc('1sh XP(d.ER'jPQ(bjhNqW>  xL!y#g&BT^l`RRE;L@󃇞7rv䳖~جݨӓ6*?>   ],%esIi^r]|LAC23Uk ˁd_ 3bĬt\kd-h(ko0 a8!:YJCjГv>kGFqɤTVy YB(Փu5B6G/cמ'DZ}!Z?w9CIIBR! '5)wDZ\-.jn y,"L,7>`Ed>2ߦ. & tBL5e] g`M`M 0fTJI9c+sl_;иEA/`M;4mf|QNHTWClL&%i&kwg7ۓJ.D `LSM&epJo=i^=Q^"dEfJRBBG e[HD cq:ljtj!-E&&NikeYA-OBt>adE e,w>1a#M*_EcYn.yDd6k`4@NFo!1%2>D!EQ̃_ƳֱN@V5YY$NٶP8βYhl` n'"k?bV̺u65&)Bd81ɂNȄ,UK*dɗ YjkM;݋%@"8s"S4XLIU_KdKm]N*a&tA鑌OTW|`j.Q[^;SX4Fӟ!H7H; 7.|~}+VGӸ?cckrqupv11dži<)ĝ5 oϗUON^fj(DHZwλQ(:U0RL[3Xgz1=/UG]dר]ʪQ549yy:K9K2cy$GegRLkpOߗ'ǣ48_ݏӏ7?_~?y'{^{|g` G$/&o_DMx>_M_Cx.C>z5;T1g#V2ϗ~ПTɯGlٞDY^ h&ՖMGZuRT %ZA1P'@KuXQzdbFO(,̺d%{cS%,Ɛ:eHΨ (Wk?BљiUejҨ¼+="ަ5@u^ouhaS.6tASq;7HH!$<~Cp2)FYYfE#NAh 6i|T! 
%ƌvPnDNse0棏rҞ_jiu3]lgE ]wv_>`Z1pN$0g!,EſU IF8J3ĸRZ*D;acnP%ػۋimfiy%uH&eIzVo >|bR: B44B8zX\a nU0a`id-$HKh`)%AYKf |̬0ɬzƁRLhm`1:)d9_ePF8uկKC**Uyw=ySc:P򦇒7z[& 3AL!?>^hQn5Q1TM,%+q%^z%ܡ=i^klJ5}' 0nR!9z d9ltQNϙ"JB^R1%oB2<{C@kDՐu%PH0!t&(8G%[Vk0".֑Tt?L߅x%4f|;zs٢ḃNEx2jgi}@gtopj*>^^w>tE*7o𮱬.yPT}=~AQq{t_<B>NF[M/VA{j>g0G@]ѯ-{@C]M/+N8 lD&k{0} M4dKxMEK+Rg{# ~?j_Z^DdE5UJ 6F!l. s1h0PE c\BeQCP"k*W뽑0I=z#gdwK~؃qqE,Zem@PԦN:4j-iH mI2P T&ķ '_@, wAz&KkiʨWov'}+ICgm&f[cX_1 ַPPE8=iyVp%HSr*PU’Qx1R$xϧ6.Я /Cƨu5BcH¢B1YKX+D)g9 eDLGu>xpB]ȑsI'Kyz# 2x8PfZQ*/1XF#CƆ-F T5H=0S0,/shiڧ&/i dͲ>l E ̂|Y<Ck]*x:B\LgC1AF (wyf0=a΢WN9]#bf18+=j.blʪa@ :{AIG>["B=Ȑ1h`Ơ+AsEϾAG R9 ?U(zmx;]>'=')v2S& OL,671QND4ݠ W==ϔ::E{'mJ[#O Ş(7?/Q̫oǡ4qh(W{#knڭ&]o>1y# ҖPH^Tٻ޶,W~d쇙`a4qF4ijHʖ ĖRH{뜪2bV:.yR𠢷XB$/\\CLg;m:FI#ݪb4#oy;7~7m-tt{@ӻ(?o*'⭴쨑j>f3R+I.i ֫Y[i;—׭ɪ:s?hHUӖgY{Oe‹ɴ7U|5*I5r[wק՟Pˡ"A\7w"OHKvUt GiXR>ԚAN$Oxo+#T@luby^yZ*3=Az-*Tct)x>Ix$_]xUyJ=ޯ] "S71nfu?o zfy?/ЇX26?۳4nWWSheBQ14JY_-UiIT,c#fKBDp XASd3 ՚1wT)WX g?=iӺs/V{f+ݐ˲~ے;fDpai2)SnkUy IG5g_쵴ԟ0AylhhDc H52E p 2xcOLxA2,NPc{Fe/c_ KF z/x$ѠbYcg"B P%*+ Hx`0O*pΒ ^-v~羒;;(H 8ji ;m﫿-mlݩ߮鯨S1eJeu>Z\\0{g,sv'n&Y[]ڍӏQ- Qۤ NYsnbБXGiL  EE-67< | 9M<{l6A DFcRm [bntW(Xbq,V V{@w ,C^9 `?RHd°eN?e 7 d@8h5x)h1$ՠ]ņ~e <}-"/l`%B$JˠƜ"s1IZ!B#5@C"e[CZhA 6)MyԂlg|P`BQ3T\eE,6h1u},l`1GR,$ED599"!& BS i!bq,Pg`¶Eu^u5El2,wEwp#eN%8]Dg n;Q  qi"E5;Q<@v '$WbNYD佥t䷵߀Srl+ss4p o6?7d?"%Bvfz(W«saZ`@Z)!Bu(R> 9Zjv(#6-*r~@c{=ٽR:dS`Ar LDf_ XJxvL 2ısJe%OV/Ufs2)ŮP-E'dN.o]ǹ3Ay9!F Swk|䰞}*5Ely]wEJTq l9^p 0WH[Dl56]`\m5K;f9̭gu( 1qs\,9 __/\h& aIvDriZQNAB*yQv$"<ylCNHII9+OPE#]UN JbA<$bhDXQP9IFIyE?U$[*{y\T5|i`4<)TuN>^P'?-d:[|C̦EK}{_{U3U4I O~8yɏ}'y$''1Zt< 7+^bR,f=^DkWsįg]K7Ï:^B鍻N^Ix z+#)H4V,d;fg}1{SjrIhpTӄod>Tx} 88ibupiӜV=#DL'1]V+q5 un.m2飏AQUhw w탐m5Sևezd n{Wqk@X~e=*q0v8!LhLeF\TpuĹ`[wudWE?Xo͗fRͰ9޶G.^h\ݮYNiE ӄG F9K*`!TL928x:g d1tB練Kͥvﳼż\)25R>`u9*@n/pXM]94oLHĄTiG߽{mlo~n-V2mV{nI[?nʶi{%\owl?^u!aIgjfy^o<&}"$g҂ O.+u gdrit x37ShDw3~]jɯOűTͅi5#ry3Nf^jvzCwk.]n#iˁ_Giq%@i>R^N~P+ l\:4^!1e5Su@AtF#sZ&YdϞY!>)$aCtYKuL79p=KЪl5wuɏCtDaLuah|zXp Ϻ-i_2`nâ%٭ñ#j>?:\O 
LjI.HDg45xY,@,Hd^ D Qd*Jf҉3m!+ʲ)jsRYhHYgހ3&mԳT )yL,+(vxW(X;km"CцudgyݕB:'^O2'|cMh/HqÁPC>>(C47KZxIȽX2V$s`O=6VϓSGvDIq侱xp3i'K%} RccZIaX2K.`Nzu;AP^ǔΑo+ؤ.W1/n?/?zvq|@ZXßγOvZs ^j j ffU{xCKXxXhy&hNL9 #''3@ʤKS VYXf12Cpmr&s9x'SA tt"e'h{eJ) x 9RՄlwwvA۽\^1V1;S q4.<СSO'8/]=N*ݜ\}3W#Z$5jϜ KEP,&Z{^oY>o0j,hX4GXdB`.Cqjݷ%լ} ~:sJ= kEQJH7sl x1b sE#gwү5'U; ӠYinNg3È{Z}}f^Xb޷W{zZJ-K㫻+׵CR MX[}Z[Yb\4;j]Sfu=ki9G^lr^ԲҞU_>wk-hOT_y'[ JϾȋgZ쟿6CA-dg6/im8mazKa,mڝ[ڈSaQXDBCu ՖT #*:wXǖIX$%Junr.(ṗ`ɉکdTd^&1c^i9hR hc a6&!\.Ops-22N22z뺖 N(ʣzPs^nQ9iY)&ILU!XFKULsDɤ"ҽWgp8=̉{"1W [&q6$VHKxJ5Ͷk#ϲFw!#P_ qLEVJBpDcɎg _} ج!CJ%Ǟ2J=O9lRIfZyJ\e$&d  Tbڼqg.YBsA>0g9PYb))(s}uUFhŘui Nl*0Ԋc"Io?@!$Q,N7q*䤍76rRƽ/LuQ13Fkί7uFMl7UVY׮mO)ou89F˭KQ,W St莅ŕ$ƌG9=ug݌.Ci'|Ȟ[[ŐaEPI4 >DE06&o'QB@-DT%âG-Ȏ;:coZ܋Kjv71jR7Mao (' @&J>O]e_;B$>Հՠ PkQ(w@l|5(Xb[eRD\` sBcNsO[٪Pb}5-i_t ӖmxCI.SRZt>TȤH<8#Q*bLVp&Zҭ=^jul-B$\%sciraZu@oMMIhX4`RRY/0qGZ!KYw$0AG!Ku8Q英 䘓I.RٞB}c,)KT['R4<JZŨB҄.=#=qꂯJj-J@=cәDc: НR;۳ M(, V\aUp3L3#=y[ KV6U@0IT֟ҨGW0YVolXUML| OhmxYyOgR  q5;97ڸ'~2pd2%1xu?[o5CD[9}C.x$U*(bpTᦏ x6~VMm^J,#0Umc:}k߈*~m=Wӫ8֜N/''qF&@q7tV\jjcFլy1ѯꟳ'?7~^?xζ Yɕf0W`/nk"/7Ӧl ?\.%zR'P{jFv#hfT&U3& =Y9LPWu1ɶ^J1^'%ZR:VD>?ƱfB+W|Ox5ޓbJA^&~l0=/ُ7?|ÏoθgٛS/S.[쳇g `>{]_еkoѵK׮Y|~r-cRfodmHʷ/7~ &Ћ֣\LҊtq=LUZ|Ml~55TqfT?+QR#B,!`@GWqiK紹)^{r}&}nwx7;.;[`Usם?_M9i~Ѝ+H䡶L$[|I`W5`[ gb,ڀKI+SI=!`:R+S )O9V81 4R ہ+Ȥu͋Ά8L"8W߷$jw̼7z?lpp3W/o"}R]S 40 JKaD3g466٢Ek33o';Go_IlTK zG[0xR(v|7ogoiqnfvy8?$'SAqRPwv>YS$hz9 ?s]%TEu3lί99JHomh-h\Î%hw9vߨ.`ུ $C=2A? 
;Q+m%pejw-/^2e5DSRb-fT4繖9<揱CF_W1k>\mRmEmUoUx\6PZKOԳpLgzp1tO M'~7߼e|3SJIlg喡s|GƀLk?m3e׹aU[ڰ k33[0rJ=jIi]J{*U3F/jPVmiW>:%zKx{sNWw.[߮8,W^?pz4BS9@s X:cw3S8I9O@H81ۤt1S ZV4DjNJ !Ji7K &0WuU=9>WKHye&Dr_y4Vf JO~wTKGm|@UeGQNT9H|3C‘‘3腌 ս)؈~VuM?aX2N2O f{'C!bp7Kč4$k"- !!DŽ*amdb0&hK/xꉼ9^X-mϻ)Fa@N'RMK\O VbgQ#zһ:!#Nz_z8pmӌPoFs;ҨѸLqi Y R&H>@r ſLS5ExDQ5,< rQYrfnoџ3ŗ 2ֽZxkk>ScK{KjF:JG!L&Cv_< ]D7qE# AHRMxuy㹬A]~pqgiݮU;$۟:׋D]v*k|4ܞ[^#x <ն^\Z$ \j"(,Q Ù)/RvPujC&⃻1Bl%tmܰ_zn嫂"(:f杤bQ,#4Z{>;XAįqhuP5!vpƺjqDLb<|:E)a,l}X,xoXmwÃ$=Pl@ǹk@̎Ac$0P^)`G~c!,ӟ6ŚZ.xB'\f^& +*+s%K>4%D"aŲ"aL#sQIh/˰0ŗU8IH<fRg# 4Z5DkCQl@/rz},T`XJmB$rӌc* R3B2 M #[F0dXXi]R{XiBD'H nT Qh". I D3 Ό hS+%s]@LZoKM^ ^j @qj^bߛ ZY+;!sUӨ"]_W/m`Pv#;el>gzc>̪N/|mk<_[߁ڭևeŭެ<:νǒ祖0kL<> ;[>8ǍU=pAFq:Z=*Tzo{fީ:ϋ;y;o%vYL4ʂD#*EefYj)7Nq.h*H ҌCR^\}픟\er5C,df8(͆|`}17JI_ɐȒb@r3 2FbYqRe0R%a-c_s].N.2EEm/d0-1q2+ eǹ R O@9&6Q+F 8-)&J܄8dLya&Y#M4k]LJk[>|)[Ri~L ҽZ>=m`KyM~|z*ecBWO k71Ҷ_<6B=gb @Iw~&Z*!C-L??ߙ]s6WTuU\]m2lvXXIG%KDYR[_@w0~q̟0͐/=j 0MIa(ŸsƥwzLu‘/RX.e+3Z]0BuU8 pq01ś *Lpk $J3Fq 0N,b"D8,/&dA HhI\"pJ[Nq .1mD7eqH2D!k&⺱̢E*SrsOtJ"7yc)Pw;Q T1ڻ_c[9{|v3k:.Dx~&k/>bcx}"[8f H\hLԼWfWfï}>QFiiOmjWVStm:;Ξi$TQXj J[MK`5!J)Y)M_I=և*[5c"_1ꝽgT ;;PN%"2x2IzCfj Bn}s-fIlA1YĥsAg7vBfbi™ 跣 hhT^srV; ~}О~h]$iQK,2xʍK C+ ƃ*C),xg9 RoLwqBm6 JS.4E7^}kATWJiEa&{et fN0}R6|,eoBTħݐtf L~W]jYg —2YJdMjZS׊JoS; PZ0٭Y@ ( vz#=y$<qV{"Fm7Dg6Iy 60 P=wX:fCH+^R e*6Xؼ9L#J/ 5b]H?|oR(GTNϣ?o1l/'d1靖=yBքU:y>.6G=M#U>QƉ"y08ۀ9BkA'<0fy4tبstCػ?MFj+Jvl8hdKĤ[j5}㻾Jw4"@1J_ڽ3{7Y%X?a _De/͛z{tax}u:F|'$R6͈QHt+WJ]-%u~u*Iz5 SH{rw$>7 Mb0ڶqu3l]E#:NQ2,vrGrAHbD#8g~$ЉĚQowٸ b,;]_/E.Ϟi$:]nԨ+izoeI SmRٛ!q6 qahrM ޫQ2Ucr[49B q{Gh~Tq}N#kIr`, ͮ5#b_NCpBQC5GK&J+C4l|'uD=|@K!udh[[Axw6fp5Ƕz\j)i;5a#;zQFU%SwzufJk=|!<\|8@DS~y!"(DHS,]k`329?2 ;4cBmOC_%rbּzmx?|$OKjHS:I{،? HcJTc՞gبDIp7OPʆ5z- ? X>>ܬAſoY=v&%!E1ݣc.n,Qj`5{^/3R&LzE~ZrHBqm%StOz`'G؀~*1?O8vbB!NR]v^"\9M;v]v&R"SH*v%SH7.-dp^ѭzOV4W94[iا(be= z˪ {̪)6,DjxֈR,+P Jv $^NFC?3G~n z6۪m(u yAF.9z-sk-Q[f6G/)Q"x@/%>BJYAہx ia e^?lzk~ ]u! 
ogݤ½yy~kǼaīz8EBf.8cQ6eYB4j-ɰ<8O˻4`o`' q ףQ lvoy/j2ȉBzJϻFQU_|8)`KbhʹI@3duSD!JR_D?s}'1y%JHU90 $IxOQPIֵ;<FuVhNgwl1]}AZv&\*p +e_0&+>Pz͜M8Q̩Nٿi BM+" =IoZb~4WT%Vʄ 'm#Qy.9PG!c H=a. 8ŅJVY y3m3e YQ cRZ,=8RyA>&'%?*{z`L+RTI]j )ɉ>5@|j ǛA aZkVH(SZ#(5e5Hz+$Q^RjFULi,w?۽ܛ~4 's7׻E$'iui8_j=1X\E2EBax|0̰FMѰHNb@zDU.NxSЍTj*`\<Ҏb6UPЅ:PX\p)dLrQ([#-݌.f/鼡h+*HL/w6b8_yҭla!&ŭ2b1Q&X kKLyi]Ԑzk0N P@O[=`H 3)xF$O'&`ʟLΤg6ITEBxJHgp,cR0t5*HQmTN9 bAŀ}GqoU+=8sn [vנ@$IH74ά*I\ /JVHa8ϸD(c‡rfXXB .~jK%RK#aMrg3҄pj c11(k V2 |[]DBTB>ܓ6"98: B2 5{2tr3ir:\`AA\r8 akS1 [L)/tPZϿi0ӳ?9 qNzD[zl[%ga[p3spBrH@q%mcxZ njYOFIR=>1NDNxhc6/q'mH/r߇l%qŁt$R&U)i$R{.YBLw]]U]UTH[?kf8F`=6ru Ϫ%^KZkiPhlZDUnp1:L)\my1Ƴ1@PJ:G$)7R x#i_ܹ} @u\vs$.BQy=ԇ)TXBqc+ۓ2FX`{HTg$X:SF;"y=Kr!:#prZj:%51u#1V$'LFrglvZ[I1%9!ޮ9\2k1YN>WqLZ$8I+!cJ:?|\ב";ds%i!T!$EeQ8 >R-#0Dyf Iq sB-Jt{!%~\j7sò5e|7ӻsW>tvy:߁V#|?񿚳Kx2[?|'MܪiVRMR&6_;_sTiZ"G'stiZMz8+0FϵQ95sIQj~rwz'v< }L&LLń4pIV;wvуRŤMEYd\nRBym*4{7^Uo@sWtj-k^l'[>׹ @ qrr Y;J*ccgg9O 9Frx$b̾jPɨ$o  =tϿҭ)b{/ß~2ջ-&Sr܉Fov2X%r6gXs'>{pJ/]+o塱/݀^;ҽGJ~Sy|K%!PL=GnudJfGדj;lj~ɡኚ䫐ja 0ZVH$'m0%cO@mu@B8; .wX8*AWz.28p u*!jnI6Ԥ-ӴdɁ\qq5%㢤Bպ-J"C%yUC߭RZ}]2BfV+RHxG(!Ga0@ o!T)L,92"f\zx+u:@]MSr%jwߢ>L lbfI@),Ӡ 4x-ν 1F1}aL*ZL#fJ~KF4 Ӥ%CZEEՏ:DP6.@$0b :2 qx(Ah6TjwkDhƇzqM4u6ZE-9Zs;` ['F)K BԢ DL!O< R7Y) >Pý bȜ(@Ւ893r*bX7Z('ԔSq9(1BD/E)B"-K8 5cs[:KzTc:]%>vlh5v[N2{/n66߇͖Es7a೛]tN)xinz+ӥs ,,ʯX,<7MѪ4PWv 4AɻӰrf$䉋hLq"o7BXA Etr稣"OLe#ݚ'.Y2Ňoj7[. RD'w:ڭ=Tδ[6ڭ y"%S҃Z-bb5Sh4Z-cOrh^w'M]K!u۠cZ< 5a ңK5;GO? 
?\J!gXTpoq[ Hб`F!8[%Hњ tHLzӀtpo*u80iK46DJ{Ck% 9D")%w:U<3IkBBfɔ%nZ»ϥAҡ9j;P2و&n@Bf87$q}n4H2szEHJ'd߭ y"L$\l:Y5ZzNDqzuLˋl~{r(_Mf,߿W}ȧȆO>SFhy4I-E>a&LO>SF؅ND^b ~掃fAo_νlN" W,.k^}Z{ҌE+~{?U l gnS==C bӻd C1$v%aӷʫrgeo´qUTG{噉9?栣ڙo}`~sW|OsPR{;?6)8x`%Bck$PNs[zr{'*H{ 1%# ǹ0p yJFZer{s`-^g6|4F]`<`Xq^w<B7GٽGd("ݮ/uIT$SItX)~і:yd_?=d6m@aRnǫQ_)J2v19~}AE9+ my /D _(v ~˫Q1/+:*sf֝vSߙt G%MXn~b%Y?yC1D09CVM$^n;'ъ]F[kOohe)J֖mU]1zwa`]\~pFv:x+q~l>Mp2{1_cePĵY;V1b(k`s|SƧqy].Hwr`fWÇ/dƨѿi7 32?X>kym7*l/~TJdzJd }r)"DrB7ab%35Cg=T}Vgx*qGge̙J=QOaM0Cz!j;Xg6NtK6dF+C݆W#_I\G]=*}#4*wy'bq8WrK?=$ޓzOw=Ofc Bli$sByO--@h#V+cE>v[Mɂ#htϛIor':OfsQqc}u>=u:ٓ-k#yӠ[l&̘3Zn%Lvv遊O2>A=2 ~$lY DakMZI>[  TWr=(M\[߯.\ٜJ=!l|#wp Ƅz$v6]%SL!PHVxU̇( N$mlT5%(iQm #({,{7^3镕8V$Zh%NV<S'L3Ox!9G`dž#VsoKKruy_HKN<͝tDaƇ!]شF[0r╣.8XLؠpEAº{LK=hlj~ɠ!%ō#F`"`mǒg&'J,irN"byUSxa W%,ܖ$=$E̒"c$35ګ^d}a5Hu5AJ Q¶3|n_ ƑjՕmNZ$Gee8,X!ңm9,c_wV gil%ȸ=~J%hQ`Ob#cb :Փ~@OтDNȮjE:WDON+IƇ}H(ڵ}BD`( CI?mN`V`Xȏ:RYupg'8pZl&8ϧg@׈tJJ0qO2愓1nvHpf J|O 4}9lb~ !}E sLi]P{ iGybbN8O'Yd#}DvkE~!ƤI`9cl1n+e쇏6oC-={sv,RHv|^p|vuKRsF>fILASF>Z%5nȷNYNkwn8'UmkmF/]l,>Ar=1YI-nɒVx&RYUXEcucW_'ƨuNCid!NW$[(t㷐uLje'!s{Oœ\I8]ycZGw])drp_Sw>.[xKxBVy6o:kx2,yE"E͋h3XRҵQ{t)SgIt4) z=9h0*gWKWCJ3+ڡD\;x+wS"xDr4u)2eMF:p-7> Hq!,ssH.#}0TFooXͅfª!i-q9sUF낁ȸ->c9ɑcM 0>Q:ѺVqxv;1 B# |7@bAisQ:Ɇ* u\ 獊"#iy۠$> 股͎`d1׉C"NZ Rr1)Ȅ5iB=vW_cU1Rj&رlgu@qmUQa9:;? ,@@Yb\HmBDLdqyzHd I|cAJ[n?YcOf[NK8:g#(@K4}%@jT7{L=b*@J6$1$YJHo58-kP'giM7nNVs[x .VG8O}3ӷbJ cؓPmiڍ0 $9zK(6c釫O]sȟ%Tm>740V -}X 'ݔ~)k _%&tF??^ncE2FO*Td+5T2qY#53'=`'ɤmj*mM?V& r¸VdqMn˯>i#\Ҏz{ 9tIpJa`![Lȕ'蘍1-N2Y7ޑ-{zA@7{0rrqP8ƅ^i Α ׳Dq)O%˰ʪ %RqJ#7oW6ƻͱ4\W驶~`ԇ<4=ބy TzYi9\M"飿co rrJm|w9yBoum>"?vv:.T/={"4?jm>%ځd YpgCൈf1(4ȒP}!.WSzDd+6'|u2Zbm:VoXJBvR Z \uDMWa ApZ康ꃨ y*H3NltAZFbnl 6i 6$"9<%Xu`,$,`y6uyÊύA(a6K>Cz ӊƝҤJh%Lxs|<߮R;Sk[$Y"3E vFQhx?|Op)>& \ÇG64=i4{nJU"zLªzB7뛻\/r~0[.^|^}v~Du.5U0?}F]LnԴ nDKR@')^_ a}`/nY{L<1rB( [41 +r`&\MT @d|E>Pa! 
zD*9 qcUͮW?4Vuk2hC$c7L45KulcSpf0}\#Q[GS @̲2[i,\IɖA,N?H+SKSd "֍'?ttB e ;K'- km>Ɓ./Irb]ɡpiTC6vy^^2}zv J]Bw# 1"꧝ {!uTLJ_:zPW|b!k]BV Ď kk~9+;k؃DHVM%ݴoP a pj[,16Lp (P=wʘ_w&~A ] 7;+V}+]T p{!4iXYC%N7h+&[G- clGuHV2U2G{ʣxPsʛx#^xI̘2>$#k]ɥy'xdD23{՘jGn޻ژo߭ d˜kNrp]>-Y:%E%(F6)TP#dW>AFE2>G>_ 9z5WJؓ$b7*j4Fɪ )>ї"%=dM'ַ#vn Y9.gGƘhQs}`P&ʫu8+HN[ThE ^JJkWQ(^ACAIV}M>;FLU|7 ʱBݮMWT4˕Nr+tnܐ;\8 ̟^E8Iå$s2'QCGIX:IBpXcgpBy'=>Ox;0R<=JݨOIFtpMc!9DZ;t8cCpl%%HWVT1NRu$4)^xsҨ_I:tyxk8$* dUOC5c^ĭU􄊸.󂊸uw]ĭpNnR:2;WXj׵-Ț{<]~dNf>deT횰nQ\nZM M.׮`zp5JNm|uW.YKjR3RG vCAF`(3A@P!06  Dka#oY)ue/yuy? ey}eYq"l"?*l$?`>sm+3/$0/!0#SUfB-$Q2~j$\CFqY0A)T%0o<\Lk4*לZAO}'+\J|/dwg|vvgZ,j0?{d)^'>m350jA{W9{5lqPc50jUBRcQ,t12̪K^/5~̾skڛG EE}Y[G +%D7(fB!c.:%U:j?t,[[5ܬLǹ/G 3/gMɗc{rcq15GoVz?6,;3֩U$J'= 5;jbJEgC W/5 ʺ-l悶B0nʙ^ݣXjb/VZ5Zq.#B>y;{|fI=x{ ؒ?F_}S`'ykZC&Hmk?;q2 #cah'mGz S7G!Wy-OkȻѐwC.lUmpKۢA9 |vrvu <1>6q~Nζ=;:1=)6vCvx@ݷ<U1.lѡ-m0 9V^qt.ew%/'>;T!b~3`dm&{oˣs\>; 7m3 +i:`lQ\1 }TH.['myzҍn IнRԪE/}SrǹT5G'kWrwvx QUͥx9~~VX(|o~8?z>(l?A&s[ȩeIO'8rNJTCrСسSb#VbBL[>^b$!c f,O+v/t8-P> J:'vbgF#Fgke` aB|vT93uQ!<"j#|o3bQ3zER-nK̲lg /ڞY Ny=[&(Xu\Kd@fu!fL?3V~@ FgƎ|nJ@:^еCz6COlD=IUP5Ȳ-}c]ǘш.nڙ#sDw{T:M⪶zw?#[Y:Fظ:dR3$452T%gVÄ8@w:\U/aKo+G?H> ݾ\}Q蚽s(=h\Ci6=\ȟ(\/)[ F7`] ܢ|{n- ᾋ&BX٢0(ލ E,}Z C$C޺Z @܏ZFHGR?EBAy7PU(x;, JEfOb|Š[g^oo8_u%MИLa 11q&%J uq#wȆ?6rT^#Vzh\90ԈգXo6 !#Ui98H݈쎑*Dڈmt ?6Zy6b61y4(|miUo;Doe7?. |(g"Z rꙓ}sv*v h҈͒GɄ0,-w6-(*?%+ف 1+!'dүg5fGGQvrĄ溬pČ[ oD̎#G^c0&4lM8RB¤z+AWDPӂ~h!Oک5=y(=doQ ɪ+ j1~XDuUnړ!ts"AVTGOKSM)w)ژḰSZ*er@wnfSgSv (+8͘BtqQڸ=e>I@1ꔾ٘z,8:D +ihv4e탼ؾ)A ZO~NZ 1ا ^sI׼T$݄NTi3ta'N~>ɿm*!'z?Z!y%t %@};?|]8^O+Xz_i׿sؾ2kvf+i< ᓑ dT;V6iPR1^:2Rf >*FFe3ASժFN^ൌ(4 )ZY>qG& =#؂ 1)k$=׆:vZ=IւL$q Wch$)Xs![l`xFi6sAc!ϕQʣ@;Ej$Fd@Z$uzZ%ʬ)hPVQ(|:D+JP8+h<.2H+cVEУpmZ԰^Y% u+ 턳4|6r=*2x,)hW?L+"ijg}3S8)]+"<1r5pK2䶱y5流8[|l>T>lHy߳P`p[ οaD?w^MnЪ^m)=KQ|q^rn$䅋hLi=z+SaǠr1h":mnynh[@j.$䅋hL{ڍ3> -&F6\$v*vhSօp͑)nP9햋Amv0brޛvhݺ.Y2U++6(vZn46hw6Kk6n]H ,2leMvŠF vKk6n]H ,RlԼ[.MD'6mL TwhSօp͒)cF}$L31hdg3. 
9qW!{=$䅋hL n1h\ NnmېRαnmݺ.˔c˛f ='+qz)JzܞLo?,~< W rG^x+:r( ,*ct)hG)BLL)uHHTA:.*"~W)!$)\p>$ Fq읔> ,JDqJ,+:Jk@|!\DDQp^jv)Mԑ$1%Jْ(c *VǍ#,S!ha hyB eP(8 ,8`*D"V EC+f2`"[` {M*7)E#I^ƞqu`lx$wb+e_ `1Yd7%H$}TU%qΙ# SjAD`n UIJ;ZrpWdОFD4`0|ayq`8 *c"3tl7Dxf6.Ûn}L YKtCQ?:zv.n'ϧ} yQxgSb,&stUVDnP/$oWu '~@277.ɊcK.Ħ|9?B<:|Po܎Fp}ྱPØ26wwn.1u;_]+gYׇhņ>rS7_S͈a:woX?FX : Oᔆ.S\q"3-Ϙ`;HЙ "\gIkjAŴLK}+{MFa.y=L'] j^lxIۢ'cV{u>$hK'S|8)RVo=&9+tk=ou^BB&'beהܡScr;z?V>oO K!hӚԬ_˩TRICba2utcC`9brPb$&;qشq;Q,Y!F3Wϟ:aftޟ~q1ydVKXn,_>I^&v$ X|2ݘZn¥di7OMKmhd: ?L;Тˤt٧d?uws$W(0(.ZIe i6>Q?v[e|?Zp"Yd!\!FA0b1XDDz$ħ$MG3yT= 00G5kcTh/1fv6nܧw1&`8A4 CoX\{CL)J*]ѐ f&U!Y((ܡTp7RFɌ(qҒ$ ,qGHlt0)!h%IK&N!k )=iSCڤ:4Q?ŬY AxGQ*>`Ҩ(}MܦTV8; u" 5QjYXc2е-ׂmdXT6P;xOqIMSssN9{eI*Yvw&f:K3y5Z`gnvH\.ػy@Qj 4XXj <7&h,U[bBkP+=3b0Yi='aU$UYH<ĥ?13,QƇI܄MMd)N+Gd!y٢{U 4tƷS'gg\Iήί}3ݤ}E7 nGog'~6LEШ {D^[]Ie.+:{Yh r 2^Ōa2c̔Ҋk?&}ȓRB4;(yЀbB eJ,Vw0L29F'\Hx?M::I#ٵy|eS)ZmppU)(x> [ݎ85Ȝ,LeY: Fh3Yib,A@\=}UHo2>iMJU Kṯ_ .4_P//%q%E{EqA4}s奢Y[`Lu$(-̏vNVq34Oz_E@ 2Li=() %C0*52/hr@hfx W-&2EIt͕f%*Ȧ]2U 0Ԅ+YEyQzbK'_{ؕ?"(4bKXE^6$f~ }k3eUU}hht4)~tcp4G%^{Ȼ99AJft n=>q}e$čTG*I2LK]-~ TT| Yziu.EBF:3 @lC:k TUm%&D*) ӄe;#wןOMj(NKڪ{?˫;~,-9S>n ғHeEB$oWwy *(A]u3b>?Zhu|)`tǠR0vҧYGVT-W ~ЗJ-DI;;4^vڹr LY3L̬160#kF6~j?9,ys.2+yic͗RB}ߦteMk]LoP"ADKE'ٵ-;*=--Jni-lW pi&-IjXZ%]1 sTd"=ЮN.EX͹g7p_#'cגH6 jĊ`M I}~˩+& R^,?L_):\9*Yz>B\fm 2έʘ _3Bxt0CB8skdWX#*2eS/O+#W(Dq]`kZ5h *V:--(!PğQdB2K՞dDǑ=YG%5h^pbMhV |hjW9#_'m};Zt2;(KT#,}_ʩ~= _qzE I+O6l`?~1CQpk`9&XEwKn'p FvYP"4Qq[3ja}[΁u+Ijlټ,S`YmЁ,mFeS]ĩI0ZKv L+V Efc%2\5*bLrRRJuI 6ঙ(9Xѓl흰)郛X`?yu3)<|UU&dQ5 W|z!;dڅ *D^Jg' @Us!P,PTid^'O=9%g{Zy_+n Uޔ{6I}yg'~aWxَi6b.0Ynػ}sz%}GM1l,A[  TG)<MX>y9_vXJ{Ԓ|V9\KL\]@5HKo1C$ϞQn7=0 0_+ `rh7cBLX/=B{@ (A YJADGp3I_wIEhZp>pW.Z2ӭKGb%zQcˋ=[QEC~T50Fz[U)N>X7@u8*f9"&נo6([*A@-2@"4BS䝧Fs@q7K@h" dh8U}PrMf%ij0xB ;>s!ᐵڋ+r9KR?xuC@ە?U4Kq0B縤DqqiUPvE&h8~A{{QK~dz]{p^wy;lvyaNaw[Shɱүe1bjdcTJ<1j*-`(f,81Ɯy$ŀ"hl i;'s2f!Rip=JYŞ= :#"  -%޳u$+_fw]՗2ij/3 Y< ȒBe0ռX)9DIvVꇷlcfSU0#d ~{}Z|3}?s Oϒ[W?zuM,^a+|#Xm+Vhl@2INӛji'[5OCvZb4$QJ`;3p"ȁ$?;IA>!$oaI!9(K@q6:@g<2xʽ 
ʀMpzY(kCoa(!^ﴊO֢OR\i,dQD*&kzb K1z⏷cY%lvJZ^iq8j vn4"Z칣ʉ6;^:r?ĀN76ˆwE^MdK,h T"x}!1;DK>2ik1!)DXAjZQx=`F.ӄՓu6$ 8^`Y{4(^fKΣ$v]Ѩ]Vkwsu!T{0bsRroߊT$>g#ڜ(}w$6Ͽ(tmBצ/E8>U]~H8u+N-|.R卥i.|'gG{p֋2jmsqs'#vrzX~X)5^2|GM#&ĕP*I&pfPH 2ȧ-+lIV 64I*r{H%8$@ %uT{Čy+%JCD:gV\֢]yu"Nt:ݭ-b4pKօ)>v`!3n nNg ^} s J#[¿𐠼KA`[GSzlp#Iރ\0~?1C@9il >L&uB|RZE16{ROW? 9{B<`s?H}ӫ˻a@]"t$v&qb >|kx9jvr7JΤLHh&mg@<<ļdR3V{&bU fE&R Dֆ65:$SK{Qpj=2.ܣF:Y9'O.N^ly营ed%aۃ"雌Ut@4 [GpߣUYsDkXo0 bR!(deu5 :oY˚h`P)a&_`7o6Drd̻[.KMcFؑ)=E'|82jsoDp:a#4]M)6 +S&' " I`i>5+N AHj SbsY5 *ڀS+%"P_PYS_ttѩ=y[<Bo3B2GOH~ cb >J~3kS? )}iY}%!2&/>pIczUhVH;A _Hii\q (\ v-x#f1k(_7s~wuNBL~ };^Ǝ5vƎfS%/-*;CN`@{/Z!+d,BgqF7_o֠J+φُg(?q{gs祿\޿?L_3[4tEi6p3C;IX7z}ap>G( 7Tm[!?tfFKH!Fft^wƑVַ#vVΥ=zLp>/*/^T^x\;lt:y dY\*pNlVeRBW/|w4c7t`F^t0U)OoA.SS +{9mNEӞaGSauʁ08G l6as`|;d$ l*ZZpJ$ T*,iS瘜ȁ)bVh1DQt*T]OejgR8iN针]"vB}G$Q%#HD6)@v 68:C_4)5h'̹PVل )*CtP4)^hT#hR1d# gS[}<8e8΅A13Ctgc{h?ٿ} lB C!FvVM^\ə4f*ֵ*0=~&o աQ7g9F9Iв`rΖ%yQ ^jH$!PP  jY^Dc %DJfr-jrabP[9<" fc Z+Ot{:HE0dN 60Ӭwʔ,"$o"ROR23 zW.حX- |(%a./d1c>H0ket g$//WF3jRo_l*0w%eGs!3ny<+u8RZ0>^]q7d7mP!:iܫ+w}9F9p?R ~f>^_Ru[cr ˆ[zl,W05ݴ&;"tz4ONYXÎ{z)ءR+-̀=xOxԁyO;'X=6%ެD5`=p A)6' *`\fe1G, e8e$3M/.f1þ&pCl_-qaMb`^g ksy%*I:-x.ѳ'&FgŞThepD .uײFUt`3"#ώ&(3H}vh(SL:CS<6W F:4G@3Ʈ>œh읇nO)'w@60pe%{[1^gZ5 QhÂ5\l,2jKq&\pD'C kA-(C-N6kX 霃ƄВXש™q&Tl)v0' dJu`^s.8lj3]F4.41!qVN0Ki 1 yp m\*IVKϺcpXa"(lY"ON پX!g5V9ԀigrҀShIm{%;N =XIգk#V]R ]_#YDnZӱ #w5zKWT?m|kםWr_ݿm_ZM(/;~5Bnl )ϑeWlUͷnn-c'ھ~zٜ,?\^k=F4lΘnw& kniqQɑ?ξjۏՆK'@z~ ~zss?a<0K?G1AI]ݡh#Ԑq·_)efO{Hg5]~g_Zr+֞%oh.%[7NR7eK(ZqcZQ}tO]p6sS#h`x~Y+D0[mpO/{ eӬc2u}۶1}8~zfVh=>?E%#7a.<[#" HHFEH  Me9ӈ*= ԋߋBny.qZd5RpH\S+Y.lQ˛_$9@&T &؀W ^_ͱS%D6fݝTP iޖl74zlt.jR )74ɒ0Y]j`nI@)E.4+_\)4: g֫Mj R̀"/ 6BI:~:2 s@=V 9R '265Zx)PQnph;DR <:PUCyޯd*<&JxxX:-sч>#Nn3Z6G%\$tDe%#gi/$+J3UsQ_GsekWDG5U蘣^z"mԐƽ8 c}P4 TWqR :ġCv'-91\u͌z\,4w-!(&&Rz[Vc9 xA<ъ`o%yefW'6TR7՘{ST+_:4O1 NT{: SwFq-ECl <|,lf>c[27{j]su<+Q@?Ͱ_G?8 we0]-||n|juxW:=??,mSdnSLi܁8мJrZxH-)F{ )iHg8 y-uL<ÄS1v %L<;j4̶p`6( :RqJx;jMP`4䨵xL< 
He!Тv3nQ?}H-nTM`]pIq(7~'{JȾL[J,^6SFYok$\ѐHW_tdY<=8 7o^ (*P#c$rpzpSۢgIk;o1'r ӣt p?Wv~ϜjM5Oeu=o:ZC 8_M3o@_(\JyGDC}AJ]^-+tކ+Hչ uqx_]?Tw}#*3n|]m- UvJHpsD 4c=qՆ0pug9$x.Svy+[$8abul8Q >਄&IiwyW,y>F׌p7cBc .#?^}Ȭnܴ{b \]BJezI,.1_]ګ_+M5G㽯"!`B<̾Ig@r'?=%:.9Ap'+ɀrKN2sO:fRP!L K8%pW5q}*'* asYh*^pJVNFd$@0 4IE,8Fįj8~40*@ PΰhS\kϼT)mE:/ V\a g]]LUnth؅x)ӐL'/W^(ׅ_ws (Bu!^iY}sNS)@X&ďgDw4Hjs'ڜXfjS 9no)o g/GS]]7kjߟQ h΁ub: 4mE?Sw*S1E0FcC5 \{/GkVYC2'>iN ֕'Utͩ@Y!OF9iQt<AOF;Jrin99gުLk/Ffet"|ȢJDk,2"ICj<"!FBz_X= usdrne>(OR S#GsYݮYegII?hs}?չ͢-S Lfj&sPOb 叻B-1V }PPH9yf?\ߎNJGЎ|'jQA˽?ѦmҺMQ Ub> I? Jft8ƈ#';9g1 )-E \+-X2*颇ENf/ |GR&]hjU1Dt^Ąg 3¢#)1mjLkC8Egmy}d׈jc1Ftͅ&&-i((Du0&GvBQ3J0D,ty0Jnxu6)lkҶ4;b/yiח^/  \O$eҾґ@d,-=$''z:$x̫&D!ہ8~}3aK=hI'!!&_o X!>!qlBbe'v)聻\cE9c/D`:7?8^2YRоL$5Gӄ@'ܐ6HulW#=4cɟ~t0;dXI*ԂgJ@8IW7]+y<C@] 0f Vi~u[PI$TIdg46@D))2cSpQ`z!z!W`aDg7C2ɷ mg jEp=˴exjtLqYKAܥZt\Q. Y$ցZGcf0W .@q~*ymմ4i1)IOxjE>_8͌p޹C2!,d|u9&l:9?;kC [7%Vi6pF{,l؍Px%]4BQJ{%$E%dVD܃>?" U< xaSq$ "~kl8FF w8?ͨf E?P%mll*}8B20|qLyvr0v ^,tFR/M/8&wwjx+- NHM-oDټu77CFk)cD{59VB^J3)L(|D~qGfD1ƅrF17^ᆪ%*+W9515" 7_Yt޲Ɖu«@s:臗H[%-cF 8$Uƒ}4 5…토d 0uE]eLw ʇS1_%x媕Ri~բ0. 
X`5k)J;K&H'wWSDsPWac .\$6HlpQe lz3S'I4Խ`HS%N %k͏%ɟMWS.t> "PQ$Q(WC˼\s4 |o'^r͌6PHAd01;JX'W^Hcc2M((Hk4DGJȝÅV=F{ -v[,b׬;ccFI-Y5=J62(MMg%o`.J;CZQH$Lom'RS~Prdi~:@sݨqhNdžJ)Ԭ% KSؖ| V(m$Z3ZfZwNJ/I3`$kD!*J@ .PhJL1^PO3Pصs<+xkdUC w3,QJotkoߜ}iq\,<۳u>,P|KEy=V&wm͍ڳ2Sf:'\6/I@\ؖGgl~lQdIQ#%c[n51\IIIVRm K#J r,=șJ%O|JDTZ㢦#qCaׇ5ʕRe8oL8X Ve6++=,AɈl|a&h;A⠚v}S^I \#V3[yC@#b5v"6n_dF=-v1nCɵuDQtqtYkp-EOt߭[T28-MDTrZ\ڽ2栌l$%S5ɔ !y"iph>*](»1lFŗ.ӵXGR&'7\;*;4/*ykE c+yכF-ϱv^m _1z'U\6,l@mWh}x?Y?w~w/%MjWjM$NHXYx{Ƈ&\=u>4"d檐J(RV}}`9pA8jG +2f0c5BOHn2843B9B@DsB-cygV4$KAb#:cnt遧[mkj&$E4E)&} n-Iv;`xZ9i[%O4V5!!'.˔=ѷZxka t+Dr/im+WR6\g=NVg= [טTy7 rLu1ymC ʩ4Q49v\q#(0a){QGVuz(5 ~OķCL}{# 熘`o*3sigN'`ɀO2@3&fXUSnZ*$ ©EE:P|p,*T̓R&:dJ$6-O_S*^peJspŕj9/rCISZi ܂>P~|]yS 7NFeq^:, $I8hvKc+HOMECT>D8!Ud<̤DmuurM7:> \xPCDe>‡;%x@ٖ1nBr0f9m "0Bg# yoZipI^x2k|,n3rY=XSB+}H:)i`y,\4|$IWSͦXRn˯eNP-VUHu5.ep7g {NF g Jk<.:k_+$%pZ9E a'BOݙJ&τ2cgEVydL'0V V ַH9=8㝋~㸲yce'ݱvlm6l~U̿2+Ò+[x_%Z[8RPx: ~G^Xa֌<5EMMZb %9>IՌn$AkD'dBV]F[F-S\d˴Rʠ`FK*Lyw]$(V=PXbk *c,ǖgicBU P=8==ŕSϙb21!w0zMhy4J*jEߕTW"TϙA9΁|2~Vp;#n <ƑZPS^8V-ej"븨B"$$]ڽzMjRmw ^GcFm5"W=TX4UnMo`:J 8_ "Zewv4 %X?˞lK?nNY:߽-|@_u řl9Qwcxek$v Wݳvf5fxp6/JQ[ cK}<:;@g!{w+bkJ8@Aݏ<%H^Q\3 윘:5^Fl5nfW'r uruweW]sj ^+]s"yZnm^P/swBTšN) ! ¨&}}nz4fFu1(*:SI㽸cr2/ŕ'7u { q%m-9[J Zsa4nnYi<8o^BDqe4Ĺ$9\"C9˵2.NsOSxKdQ.a0\E~\VYΫqU?{zRߑwo>`%f_O|O;8spuA"ٲObFlooG7}G9c\/<>aD2p^.f:fMt T0/b-xbV_TD_Na`_ >.Onz'ׂ CqY%15*n?b|%񕦊>ő!ƒKvJJEma,YlyWSpoРQtӯYm$ߟ_W寋UUd4K9 (7{#\rEycfDaNvuZ8xirSuJorG(em]>/YZ~^o1R@LĘUoyѵAjzykY\+Rѐ_P$Lьx)|#[N1swyy|*.|AO1V$4&: #m9qm8~md >/ـo>“swuukyGe|?۴%-e E(K khS=U +g>)@|\K_MtUKx`\exYB޾<MQ`rc\ݿw[HPH JtZ+X_zS,n4π$9fK`f\)NX'3嗋/ .cp{]d1f⌬( =^]jʴtg.LK! 
񋄷%7n|I&^zIC"TnލGE D7c-^WO7 Fe z, O 3?IEe.I(,ςG򜟬v2o./ku &9Wk5/-X\2_{Zm:YGƊ12{mx/{`Ez dFWާN*_DtjAjZ v jo3 bazjT{=JxX*ak|r"&/bQoI.~-Ar gځJ&Z$ RY^ E6`9@Si$q8%ұ34?=Z@ c583B rґ@TX*4 2x )<ZJMIeO0czsaRi 5L=r9DlH t{_v'c:&ev(`MnLo(KDJ:-K,B"3Gpݲ3CBy7hG>Pۘ41tgrRLq٪1Ԕݳ,v463B4䙖WT$K*l$[WU!>0R<,lCb b$Z0羇n"Tof)Ak{ >=铳LLsM[Bp)FT͑׭H&nݦkC:OC])ѰGzPJ K8Ez[񧮁^HнrVL$Nɿt;jɼ'.B(#k} zVR ND/ FǻQOX'!JobK%9?0% T<ץVNj+HS—M'ꑞ|QzVѶ"*Cq%}bΕ)2p 2Uڻ er@^T(Xs@{۟:+NQۮ5)HO`4I>_YRCLf;lI5=GqMܽ ;q2G 4ځ+s2 e.!.,^08{dΐhYCT \*БF62GB%vxh͍QZ#q.j V4xсnع/sfdp sF`9:(.we8X Ȝ:9ᠠA28SH !髶2Dܽ&ݒ99T%Xs Z-dvTA}{G2sW ^c֑LSdx1Pw52AH ^tϋ0^~G^r7yfJ{q Wh,]Ucդ'ONm˧ϟXo.]w]LaM'TF/}Qn59ݛGIKX5:&TY=MnL4+@<m)ٻ y3kN慐L5J#Tt5ͩ teB57 KUқXs:H#~oOf:tfg 7a#wЭ@WARv`μ|O&@=̞\MP˜NB&:uws6ڢhd=>d_-wێŲkr!Ok$xf|~"C>ԉgB. ۨXҲ} ;ߍ \wP!|Wvx'.Hvp: {Xh qnVϩAMs oM%pf͘OxcZglΡnR nZ`avl7o߬]k= }v@@AuhǑS}LS fN :tO_Eci|UԮ#]Łw|0zNfvM) F!$yIحU2+lсYʡ*@ZRkD 6XeԩdhE6, p;m$Fu:QDrҵ>R\2(%/SmuU?K9.P˚+!ş?\M~]̖WrqGg}f"ǩV2txrןaVka?/;ڥؒ)>t۷K#Gj]ve@Xck/SF 6NUʆFmG3zeWWlb~$[ 4."МeWBՁZjRm#$*iGcNUJ”VƊvU)BQSwߢApO9iTHk>MGwGjfY-JڱICy^/~~cX4SŤ"9A!ٸ  cg nb檲ܐS0Wb6_@dVh m.),W!ROG hDxӑp{$.o?sv^Nh%r)6?crEܑRO_>)4}_vOWFjelGdl 6%T4_w`!} ee}E c>[#x&hʛʚR(Q;[)nAPk66j4IfI}<¿ޥ/)WlRklBez=ۢ$LB$Y7ߧ>mwEN2QItĨx%@*9Xd jN\߯n#ڮ/T_hLhrr?/ROP;'ˏa{L-*`J4dwm몿СuUZHc?ϯ0}U)`dnoCI51,WQxߟOd<| eY:J"-[%s6i**TUR{#m4*tC5:QOJPtFjwnTQ^1o֟ :ޡ>;a>G~'V'1m@gdަkV:+NoMnΫI"%զq#%N+)F~NgnS-I`U{עL-Cp=%oy!N$7|0- MzqGnxqqÇ@|'2o܆dj>Nӣ3s?=MoQh k6K#Aw&$O%25irõPJ;B xk;tI6b̼퓐MO9aKש<זP%ځx9_F6Vaz..+P}W^vxCwh3q*+ksA&rA!d3qS`R{ot%Gfd-wJM}eyD0IX$[*iK)p䮮lLʢﱄ o;u2%bN)U[(""e. 
T|2r*@4w5¨R[yYP2\e`^ҢS3 zZGf{fW#e&3ŷEMeYѡ(yWȢQq]i!DIxTdTz (R:l" u "@YӮ,3֊@'j)-j:zJ@*f̏`mckPikF=wܠ^"AQ:JUK-~;,>Te*=VVَI!cʞ eWO`R(ۊȀY¶]X8:귎0sΥ)mrۢlFWV3qZn2Q7Szߩߴ[+:d!ʟ;c 90pJ6Gij+ˢ\usۆ]7SYz~h[%6@hӈ|V0ܻF\_S{wX`:?_w\ky%ȸx3vQRL[y뒝姯>vAC}Nr6oW?o["JzE2-J%2ŁwڹMf#Œ΅L HΙYONЊq0 Hvg=-IMh$-5y.fqƶxf0RoG\l;~Dĉ%rc Bi v84 `ƽ^3{ /H}lG aQ o~S6iq jT|<hl ^%*4ԅ4h M(i] iy!_,.fL'y}rPtrkRNsu"P+%V΃2\8 VP RGT֮2zOWX[_Iwc;mIEGy+i ~3jΧ4p ay㆓)OTއnEK>Wn)!S{{6 `ds,S @;@y&RH;__E[KZ1yĪ*s@*jwQ(u3:P"p+Vz.όߡYs1i0O̡=(TsLSKҵG'HI^N1mߡk5 :)hze9JdjdvtJR#rUmPѬpB/.jlRM[#7dTe\pkU#OEEaSYRYRYRYSAo_W 4Mŵyͭ񤎲榖?@Q֜;j9BՓwK!!WJ~BH@o&. +#? P^,ǜ,fYs_X~]uc!sUBI>I)|:3|8YFȺ&bei+ jkcEwäqSvc+([AGIjey([6Uٶ"قxm-Wp݆rP(:׸OʔȞ<.CP(Q(d-qN~6yix$B /|O\ ı,MY4 fiLwZ8Gj*IW qUtt6FN JP:PQCMoHֱBznJ榃qnO,:ŕUM¼b&]|?rKj Spt&tu7_?.)ﰌ6#a4S7_ 1Z p3ЄԆ>yB{D[BWt༊ք`=cyl|9m| fF=IF zF qʃ0"֩(z) N @Z~!}1⛪2qfS%0ݨZ[W5Qi i}AkG"+8Fx:F*׳(tSQV4K(GpdOCH=B0(Zb`(ZNG"hdGH$/d"j3L ܈P"zЫ~yaq} VռwVzG@18td -kM۫O-waup+Z[L3ytߏ;|,p_iG|DHpҷ0{eE$zJZ(ѧ;5]=}+Ѽ=[f=+Gp iZ| p3p9&WPI*ePGrry@ 7}I98=:Nuz1h8J#k _/ue; л'ٟd59LNƮo1wEb^Ҍ+Zw fFv󡘻怓-2"`y?e=iqf vZ9='|?Nݏ},3 S).6[7>l7ĘדInh"}Y UjPґ*h@|l;z/!'9o8{Ch9"eSr6 >M媔!FeSq0b돳Fd>]؋yd-A- ÕHx]*f‡U4ӿMA4[77?]~ekHP]D~2/^PSto/?:s3(_5oO4GNj27(_Jh"Wߚs;Qmh.[ k ܡ)idб #ǿ];Y }J$(P9ܭYT&᙭I1dt(3hd4x]Q5؞hǫWϒl[m׀#fuÅ'1d!3B)gWW;r$}ބg@5'ӱwh "B#(n(Dw8Fٯ417=$i^OyZ>'l4*l=!k P/'zN Z" -f4XG7hfu3*2/0:DY=QDy$Q_3KPNPcsGL0\68oZD\cDLx^k'㮖VZaAFV&tۮ0ʧ6%*Sr {wOTPK{2վ'*3S08a'*~AH {6 O4kz+A^'6TN"%ES{X»륊pnHLS.͸a 1WSthpQdV \qT9%j|F Vq|pAPjV7sҤ]Mڤ|M&Τ26jiV1*(Ce=h! [3S/A=g EL-)[UC2ΜtwTIiXf@|]ơʬJor0eDGBj&bGT(:,gKLBM3fz7,P0=MmY]=\xAlí(UTԊ3@SOPh3 Gƻ#v hxӏZK(Y*H1)KD3/ uh?^@/z=Cظ{Žq͡ol^@utVAe :U ;Բ_a69#4v1( /B@>k+ |W#e P&rtH2WbY$Bŏ<)k6MEu&E5FvKʁGF.jeΨf\,sCL#ϽQ @~>R=rJL#砟qTb)ҳF-_L6VCUmnk#xp&$y!:IսS%v=a$&$PŌhAΈHs1p TJFt4!:ZJtAo&P2xTuPsV¬aAi2fu8*4)=Uc 6Z+cBZm\d&T3%N)Pvtn7)YSc#EF&,Zh"\Ee f¥6ڃh='&SA8] um/ѡ,KC<=MMNIb6 .ڋmlԵHˬ!@Tn(lA4ؑ]QoJz,6Z- ͌! 
G%+1y^>>/Pȏ uƦ@O}< cymآ^0N\)<yFj͕'zEs%g=W%}q0'ڵh '!њڕ=݅nWzz)ڕVIqPDWO&cxvix79X=8Fc4ԼmϱZGObJSo W/AcKDE+b,w3kq> mj|o.>fzctnѩ;hPxՀო[ (_~*GbeDUZs JBCqMɳOZwgyIL0s* ^@Bt&s<+"_):Ϳ(XSѨ+%Tztj `}>e UNp3Ӣ|et*^Gj 5+uX -0ILzv$ ӒY;K2k}HV<Α\0qB_sr8=!rh}da4#ʬaEsbxRmjL=i#lTOb/DL*y5#폿!2/۴'7.ZH(o*NgIgIgIgMm}FWe:cBk-،X3˴);yy t>ZFk{dҁL.L4qIX b9dq~:_M}!2UYL 1l}N&~BcuH2TS hWe;HﳐPy ~}Qoكr¤ʣ\0!Z=:aKM,AꥳrOcrsN@nm6.b]@m*m&Pεz Ɇ3ԁJmI_$Uޣ|G9=$\Z˅ u@4t'@c@"M(TP &Tl u*fAǿA2W:VA˕]n:*E;;s\lX$j;m4Jkl7Ri+LtѺf$Zhڳ~6dRʦ0'6z\gq'Gv e\ˏh<<|3-hvϙysoùOb[h"J=p$е_`D+.SGueDžiU8|2/S%b௎{"h ~ݵwl,CND(:wwǸ_4״oגZٺCRcvSF&]ToULAy7ywblǿ_9JPjݵsēI&O8,$H) +L|+@>7դЌX4SGOunך IE#!-\cpZq.JiP[kW'I 86%OvX0EVxcM&`Ήyֻ)qG]ћea8ٝQf&-!UB3"gy*|`J0_an'iPTJldf} ^b,jT17֛$ͮ/ꔤM\GS¢4{2J?ZnW4] WigӞ@> ŕ5/AaHl۫z9.'?ITl*$֍P u-U$_Y-;ڰ7"`"u[DfTqtuC4rVw&ٴ@;.q~W?B\ggdY͜*BaYjqWVZa0വJHWP2 (hQ"63k cW;2Uq,ad\+ puP}hZ=dt,и7|hAhF"9!tg(;[OA|cQ<|~,j vk$$/ :ZrjuS/;UPyOkI2M-, #޽܆GKghЦ ^\D]X@5MaWXߞre#R{'=n<Ufe;W+.S]V1 \Sy/;E(eTg՝[>CõS}̬j FSyaQabi v PJ46AhDh"\lHǔwlL':kpSӬd0scdgݻ-EҚ0G*^Lnny@ (chdJO夭#bj7{&[<f=vC>Hk D3#~h)8{!ۇD㑰NI؟bD^MyV0}Z$%hHyG wJQC!uVΰ/hBCvJՂ>@(ԂԂ^=uN}(w^[ک.FN:ԡs[+ݱ4bc!>\\55FU:FB,XR`,9 wTp2W5~t t p*iTj_.2Oj4}(Qs\Ckk;˜C/L+is)tHIMl =@2Q&!=jE5Iɉ")}~GL%6U o ̫ C 4QCu]F˹DgM 0;q9| xX0lş vs`,4 9O){0Ӭ+dMn*0v>t'80"59 qbqoTj$FsU W-]~'R U?N%`hUNu-a#޸pb8'@3n2t 3{ʀiUR`@Z8蕩-i7Lea _`k0ϨKy6JF2Yo<֎oݓW" 0d<1EQyipuQ*-o2<xp9**5tA#o~ϩ%y"SKޗZJu4#$b %v>ud'v===:+莑f6T~΀FTS3t'e]9GCv6jVV#^xL̆ap^D*Ŧ[U1t] +|%Ф_[ "%m/:mn pΕXgqMUy̯X9 qQ#۞DwWK鈧WFuPϷD!x-8:z ?qG oRd4M-潼m̫\<,Le+j+(~/~v`CWa*SZQ=U4PVa)^# Aj&>n8 ޡ|V+&P~gUFsFTSO\hU r:%;szEۇN"w(1qC۳1CT:H42tJ N!/˴{t4l@=x9u:DíS>(3T!F5b߂m=2 eF6'[M*Ȧ}fج; yz:)*{KIz|sB;_*@{$z͘\STLԢA\K/A094jF3_K%-Z7k v@Q !TR/6'6hl%/ hT4'N(,4HpH׬LNw,ǯgd~ ,T&jIb3c,"Aܘ6CA>@!D] O@ΝDžn$ _-_U'x4e#.uk?gȐB̎V US&z_/t1@t:{$iާֳ$,d'b&W;Yya^2drdAGn~yO#Air yOůB%4QhYk% ?ҵΔU/4ȣ6&&[9Bp( WcCam.3GT8ҵ3P9(b=>L, 9c58; `0 tlϺ7=8~qEL_K{,CiYptn9ݣTn}B:[j('O)3R\ԃM}\؁DVY/* 4Ҁj[F :%|-V X(aGm8Nkvu.Kˋa57_7L]_ty:[B  qҪ/mфa~:^M2"-9NS-Ռlek*m^ѫ5zm۲$֞_ykԞAm7nk;W^]) 
ٶEl73mZYF>[~K2SC2U$#E*dzHnLV Χ[*4*4Kl, hrx]&TyJ-Q0(M|0U ͬj\x9+~.b*dq*Rzd4LQ@m sݤq6:CEҶ7-wh.'1P FeiJ CcҴ̑-aE͙\&%*.FEluN!)\4a4TR\AF "gXwg`Uqs(5wQ bȴuM/z/4G`YiAR($\bz㔯첒ؠÝME; 'l**ls/OOt_, `EQ{\&F`5 n1˯3bP²~~ fˊ7^H2zX$AS{?Ow|_[߽khi$8c[Ñ Á\2m.3ad.adu^L7a$fn"F'zP9(5fdwŘɚ17rLKu{D~w;I? dBKvU\3|*N1ٻ&m,WTPU=[8ĕL2/qxmFR)HEQ7)ĭfsFWVŃچVQ3k5$z̧Q4[NU1; s{Pq熐"B =h("䣚EHRq|lc$k,<&! ?*ժmjnZH?ך|=o?>40}ȖivAl?tF߂MgT*^]HޜM&(uIEPǕ咊?{ \p*Oy-) ݻ3Lbh@ yB g}6&l`p"J(њU$sȰB2 c}lITm^Wo*׉!t^ KDBaZNYKG%$)>9~V߽Țf[e'KL s}/K0b\<Ϝ6KW9&UP I[{ 0GUoFsYz:Bе- 7C[v~хMeP3n8t"cS;%>U0ҮW㍑oxcǛ'$$Mp,cJ0\0VLR8sLF}}rߗ\PgL[Y%XBvUjh>~u N1\J$ 4$awVXarZ p :<3)BYjVwasC};ϟ&j6/4# W* s=协oTtcusޘxZV@`Fޠn{,iC^oilWœۥ`}ZKʡZoDHX$ $z C1@ͧ l0=*ŷxŶym)P& #p(*\yvtй/KRyb+BW4>7RhgQ՗?{ CZZU`EυU }itrI:A>5:4I3a[Wh_N֊j)L 083j^Y{&2$5B>X*)qmS|Kbm ̦>-FGO~BHUU|g嫿,?&&8ݏfxYPL fNpQрiuP6y"nc9ߖfvo]J(-غSNBоG /E?4/{ǡN;~c\㲢g]߇_RجN!$f] , '6 x|V#wn/!I/^{C!; c)@K_hyLˤLMF(`zt UZl^T2 rerB* jfR^Iv2@ƻpWj;'} k#[ח 9JMReB)ccjaYdK$GTd,F<b_H$)J6M:`)VG½>`ݙ#He -;^:!@ح^())CPiTR%4G(T34'1D,<^Rp8l-wlhsG;8V]>עJ_2G7ʝʤ֓:{ttw Ѝza4#b2Ђw?U<{9_jVשJZo˥j)]u8x5~|;z(?3śo8~ǏI/zmhQ%h)ZIO MD錣f)2SЕD7.)aR/f~Ke̯Wϩ=e@fɄ!K'Uh 8-"5w KTJ%\^ @FyRQIw<5J4Vw회Ngdb)Y]AQe%Ao[m= RDuXsbKY=l+/f(xwXb@%U@Y8Tx7Q1ɱ"C}&-jsCÀu8`mK. \KA x"dFhιLXb {lh߁*!ABBp&;BB@yTԴ4=hV#z{U1:5Oi-4_t>Su'W~JbZ1q~~@Bp)nߩxxQ-hG$r2@[:Q14ID%G"q#mf I$[`7mxqAwr(Au1o?\7,!gJmrPz$/m;JE ^TYEmmVJHs"",iBh1"M`MKHŒ\ J i˹Ubjy=j!|B0Ұ,ÙV=;[{/V \P-5O5RŎfQ=l0)Ig"(@rA T#E<ǹz{)rJh 5Yiwȭfo_i*ia6&VHMD9YdnEѳ+y>q?Jc B08֯ ,W4fqAR(H3gyn7.Xe-' yj+׭;B•H~˲ML 8TGV9l++U 8a.$jzzfj, ~jt+Zfw7ijaJcf 1YyN',8ҟ ?10kg4ELewV0_MԼH`Eo9)D9 }y0McǙ(ܜ"JuճnV-*T _^kR2y[1H |1Y:pn7g ,[Ja<$B\0 {.VSBeЖa-y1R֮tXI x:P,i?(Nȱ_Y|gz5C|sՆJ ۯZRýzXt9cxY^W5C]vnkw~㢒ڌ簃uT! 
}E4G P<9 h_v7K Z̾T`cMF1^ =ʑ=5>=IMD'~,(YTvI392(t_%aP&vezL@jVBn)=Q׃M q4}ǫ;-"qHrk77-ց^2J "TS<jǘM/ƼF/޸qŠX}6I?YkӅ R`'ZeQG6c|b(6}֎^I>Xs؞W<#YO8>#5qgG}' 0ôumǒyw5_L4ff4w2Hhz/07 $JHb}DrEO19xvߦ\yGqUKU0$a -Ȩ(6|3"?y}ia;Z(٪pt Ȼ%uE/ρ!Jc8y퓳x@qːbf1!DJ;5ɭ??-A2GB&Y$QLcQ&)h;فl^cKU0Pul] Q9CjXg;$D!CAз"PWe؊=qT~5W3Ö^Pv0ˣb真 #@smvkر +|??DAۻ%bq&+J-UMw F-~/% h$`z\rADˮ߫Ȟ_y b92cNZ6`U,gR B&_T\NEnEٻ߸me,ÀqћiOrzGܢJоli)i@dw)jp8Cu i/ 2͕"މ,cN4# ys}9lI/,23Py݃p>uY-)Isrfo(=uÄ bJm&PTFScGgY͉q 9%KNNAZJ$ - 9 0)O2N3 r8˨B`I&e g=3TX/AxƹIK3i\a8@KjڀL9 +#1K+E18U-5K"x@Ic %T z=[i!_&D [8> yGlbB*"]u* EJټ؛!?ZsEmJikrrY4?rw3 "Iyt4xˋ30&ق!Wg+N?.?gf:aQ3bo-gH ~zpUQM5FR.p۩/=;^$q݈< B@e/GE+qKI⸶pשfZ"/g ,)ӌ8#l{ujɃNWWgQ:ԯ鞏nn?u;%q qN$;))@9)j :|^  %ǩ,U#O@H<"9 iէЯAT](˂/n>VQXVVĢj ]]nn;=^sw-Y6/ 3Qu>+bJ}x:9ְAfX:%VnB0Z1kCqe}ףXc\ k%Je;?rg48~,Xy{(%}U ǻًq殢 T}~.w]j(PPE ՜jԈ ݔf `roۊgMGA(Y.K}7"KHnY,%ϳYB+$ň -x[GNL:^4>,&˺,5 ]OIʻCb\Q]Gh0gzb/wֽ6[?xpZf EsCC?UL$Ƥ{_n*&OV<2s:BUB8>q,0&'NZ-hL1+MQG Gg@!=U罳 GT%L qe2+f_ak3^{wN$6qVw뭬Q+z_x*5$[=rx(|:2&ڈSO걑Xok'[K!Ez_朜.|s5}@D5ԅ&8Ŷ"ͭbBnV1Cm\"u]TlX1NրDeYOHZ 5-)Bv-n?wVN+>m`ejX:e,y6*=|o3bm>Ox | $Ïc{[TT,Sdz H{b%QƆ bHJlcB$U!R֋dQlGby#X%ƞ'`bH$J.(ᤠ)';IW-iPHh x-,Xx!,r%gXX@!x: %A6D/ י Ap&ɍ-?Y$}ܛy_hS}90u7P^聙-Rݯ3h6x[0tz~tFIÂRvX]8. H6tH.+U[O̊86rB6ıt )~6.u@x n'jlct$q QlDcROQqsOǏDau!QeivݢZ Ο_K{W04Y}zRp2 !ƕO(!_TxfsX^}*oDyRW2$HVn_$KmY^ug4c+")8s{w>=Ϡ&B a<Ϝ&9T68qN(V`d<ۥd>aA4YjhpҠ F LC2Xɺ!ŐI2I BJN5SZJUn+{NyBORhIRy;na:x:D*#˰Ԁ#59C3ÅRGf[Y{"LWd!$"MRUAY´6UJz V3Uz8tu)]P3+Q&MI,>_&l&mI<@*1^ 22o||Uz԰Driy6b_HF&!e1V!'bqkS,2;#>M Қo_HKj՚ S]JSq"1 PΩKôZ{R*S&̍"TލL:t~@<̇Aq٭ݛͧtbWvyPF5_~M_.?+5]F> X7@AD$$w(\zˋLg }=m3'B3C3]7|Abl;Lߐ" q6ʝp)Õ%Ii;>jqٱ2ͫdFhd*a@ڠRYqĂ)wC&-eqa؅ؤȈF:>1LRZ ChY~{ +7`:Bp~ຸ\ܗ(Y2VhOXXB8Eʨݤ|س?z*ګp|G(EKnjw*ZaCZ0ț]`Imİq>-謺Paj "jx',} ؚaR<ļwH&yF#907v%prZ-NR+5>.d=c۞JUΙRR3^agbw\ܡ$,TZtWż҄\>\P!: utlB y; n2?nV!Q`:NخtW]vPEd4TG5bzj{y[k Gc)Yi&d+''&Mf'y$y'ZJưfzS׻I#DZ|J>4DrhVsF )~ [*RͲ2.T|H&w*0hGo 40cن*wwlCVS7Z :DSRQ턶K,BϒkR-zU[LU"固)ޢrdZc}(ǁАg;K78%XnC8RήM?|?4r?9M*}nTk{!.Uh%7z3wt =+_C+Un.,5v! 
]q ;3zƳouǵVn`lJV>^{h xukb j`W,E1DXiU%nZIJ n%O`ت4ROr\}P(=\Iuϟ>f w,{g,DTw7+kݽץ囇{80$X^K%߃^Cw?ne~fΦP%PPūPPgqX yY.AaiԏBo\2#ȕy@wd9(q;7+ S sI8_.Y29mV6QXbd󗈙:7wހA_Za 2!"3cb@&b&8'dR̚("g5\uO`NUR,d'p"b`y¬w}ql=;Z7'Zͫb~vuY%nt!1!ۯws@gEUc=v~ŲJܰ~M~~mlE;xi*znoF [] ރ~zrZ]/y$Y"*5-Dv_:cE ʋunϿ?i挿4)+Z6Xi= MkK;v˚׋o눯+sWN!@U;A953%pb0S7*)7p ziž{1c qa?^b {žC>sM⩞YG Jq㱔F i[!]ȲZ TiFh;Iߕ<9^ƚt%bhY7S_q!ɠx ÃNOc: 5z+DK#C%Js( |-5W\<4;]?J|+|Wzpה;@67M<6;]?bqS1ա=c< |SL[QLv?ם rSĔ,pe?64R goސxgS(AtwG1唔fkݽ.m;]Vdn[Q6qn(ܶ&p,f@Dp qTrc4n\rQQn{)G\O9H~{?x}jQ%o8ZW84^,um(cM.ѨviL %[BWu)mmxY~ݣuƱ4^RLp/¼Zxq'Y90ƣGOوklL Y\/mN&[Z\/s|?o!Kp?%s*9J7!] fx[MqʽêAwpj60kk?iI97X/z Q76)ݻ{=[LtʃRp?u?a5* ȥ5D:4\qA]k7!wgT' 3T{dmXN8CaKT^`Y !7U !hm 6Ný$MR@ո|BaNt}RvGsM^a[jJ qJ&*QɪԄ1ǽܮޔ0K%%ZY {M;{l̚I<I81#u!pEr"Ge}"%hOQ6y btpb:c 9(k)v%/4t!9&Y 9))Cuڀ/~@" 8[^s[:RYSOZR?!'ΌMe'a6$5yN,&%EM'(iᯖkK5AaOVg8|<<^il<7:W.v>l%M_>[cna@n'# /r>%VK7CH1=Gzi^ct~^xF/gCv͌M <8%z՟1>CD#*Ojq~= ی-}( [=o/K}X@(g/fxs@ go,`YkO<^rHyUg]b_77An%>C3s`ၨˆa"Hrΰ^]^Х!жNZKqӚM%sLpڵˋղ[AIY-(8%:Ejźiqd5m190cv>zd9(0i5:Ƙ2`m#3{v?eF֘rek1f'1qD42FQDD|c(z>=+5'); Z O<գ BBr*r;E<1z7o_n{2]gnֿ>7?;ePDOsFOq,VWa 0Zd^VӓA* IOvV*.1Y/x\t^UidJ{9DbQzh,ļD!U¿_T7g)F~!-V$sǔo0ѶaFkYHutIf7uuҵJxdbɉ&j %1J }|h]|LX2ݽft3TׯhLU}L#Q#n'u£q,T!}<}_|zSAku쨐׷ooN4ȹ9ԇV]/X[f~w-~O!jo%A-joAY޶.^-?nim, ~J,>p>ӣb*Mmt]}i2|uGߧÉ2$W .6B 2V5ݒ{ϜEGAP^,2W׾{tn~2bhU*l5^\\ݨa螊2ʘnDOJƘ2䬗x=7;]Oc 4Q#V&qLƱ@iBQzBM  ݮ: 'px+2SNJĦD|llcZɺuZE>-³:rd4jEq:uU 8)4z- )]PC`Tg;=;y͐FBU%wo}NqdkyL=㚧Bz+<(?0Ɣe,*Ե !&LW2~D[Vs4[O9HI4EW^S4q,EW8Emܾh8ƈEU~Q<;8U哢hyfb"bBp1_IqS[Zh9DQ O:E%!NN֍ԣu#& w(ygaS\tyIL ; HRϪ3J=:.z.F5=zh/2x?G=B-15EWqU !2 3n9φ]e-9H7^6Z׿)NaU`59_*m4,v=eTR*CDPt>A璟$H"_ j^U. 
IPz)iqIpT3ɋB"Йꥌhb XEI9VQL*f9Kr7h6jȒi`:4oS5 Oġ5:6rnd!SYycZ}(bǙN+ѡ+j4EKu+LKzak`<s[R^Mח$wD[9'T.H q_.t!k!sgw% o=MP%bG.M.Bٙj yR1I:Ɛ:06I_:hd|8U4LZpBf$`#0mTE]^3B=HLIkF-ګ- Y4(qbk1$ZVuvdw /`JJf>| -WLAcn| Z ocȗz Z oAȗ?| q*Χ/Yyq\O3vޟAŇ쇜1hov CuAd`٢V"&|k)5JF$:o0 BW@Ahր1d6H}3ɏ,s7pBpr |ycs/3AtU^UI.[Ž)6Q3\cwVWъgi}v2t wgnMry<* n-.)]iцb.-[-n`22 z^#T0j8Y'̅#d!b*?zdz] rw#`J荀){|w4>ŭOo:lV[0Z਱@R܁FJ1;X;ZMR5lUhQ j~OM+WQ#P=+u BH5@/[RUD礪uRuK&)`!{hȶc+i<>1K)HZ$]k8Z*WxwmA(GQqQKw 9\nu~nR,sG7)ŢT2As}?ȶfgMo O`JsTCߴ:?g50{?³@GjLFհlTGDw#FPAh.QJ1<ě$}-dC57T=eO!_BAŀW}nAƍx}D:kfvz_]ۧ Ҥwu%N`:0lsga2HBnam]άR}re $S#ҋ(E4\CY륬ɋVE3^*)"MQíY8vEL 4r0 ~<ԌsPmQfKOO)[(S3l3=_~TIU$T'9r %4esatcn"⛲TZm?-cՃvV(&}ҠhHl;1^ζi|]u^khL-z<=ҢXX0\Wv5+Q,\59\/iI6rBb~>>8㦗4_Aj}c9R2J&H:g3ULF~ $6OS-K+v*CPDM{+jPq TYt떮*tUd|""证w9PKHG*THq9m\Ybm mDR#tjvɯ1 c,AIj;T/6өK-vaIu EtLutYAu_"zwϡ3m[ z~b; ׫*U `>_ОtK,nm ?k7ЛڱT{o>x߿vɐAAAY'&yZDD; rg!A%[ GCmYߖ?d[DY{n{CO}qQx<' >(\7.(| F8 3|eDaDZGv|2秘YڟU0}<$7ܓj쪐pEm5+5{N<`Fn3"sF>1uQJ3#rKGUWTd V?dEwnV WZ^?VLAe7mrm FB K@-E#\%_fڏ;p?> pu5ƴ A@sL_5M-KQIIN)Jj̿qg,1NM&qJ!_?oP3hó7 wWzce/=*Dy1ik[+"heq%]1wL\M}7pk%<m 'rvyh% 7`\V1k. 
Feb 28 04:09:36 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 28 04:09:36 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c15,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 
04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:36 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 
crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 
04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 
04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc 
restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 04:09:37 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 28 04:09:38 crc kubenswrapper[5072]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.409104 5072 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419258 5072 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419318 5072 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419333 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419345 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419362 5072 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419374 5072 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419386 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419400 5072 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419413 5072 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419426 5072 
feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419443 5072 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419459 5072 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419473 5072 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419486 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419499 5072 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419510 5072 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419534 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419550 5072 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419564 5072 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419576 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419588 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419602 5072 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419614 5072 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419624 5072 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419711 5072 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419732 5072 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419746 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419758 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419769 5072 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419780 5072 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419794 5072 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419807 5072 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 
04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419822 5072 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419833 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419844 5072 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419855 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419870 5072 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419881 5072 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419892 5072 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419903 5072 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419915 5072 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419925 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419934 5072 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419944 5072 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419959 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419970 5072 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 
04:09:38.419984 5072 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.419998 5072 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420012 5072 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420024 5072 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420038 5072 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420049 5072 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420060 5072 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420071 5072 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420082 5072 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420093 5072 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420105 5072 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420116 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420129 5072 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420141 5072 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420152 5072 
feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420164 5072 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420175 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420185 5072 feature_gate.go:330] unrecognized feature gate: Example Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420193 5072 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420201 5072 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420210 5072 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420218 5072 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420229 5072 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420237 5072 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.420245 5072 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422769 5072 flags.go:64] FLAG: --address="0.0.0.0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422826 5072 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422850 5072 flags.go:64] FLAG: --anonymous-auth="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422868 5072 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422884 5072 flags.go:64] FLAG: 
--authentication-token-webhook="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422902 5072 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422921 5072 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422937 5072 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422951 5072 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422961 5072 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422972 5072 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422984 5072 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.422996 5072 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423006 5072 flags.go:64] FLAG: --cgroup-root="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423017 5072 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423027 5072 flags.go:64] FLAG: --client-ca-file="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423037 5072 flags.go:64] FLAG: --cloud-config="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423047 5072 flags.go:64] FLAG: --cloud-provider="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423057 5072 flags.go:64] FLAG: --cluster-dns="[]" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423074 5072 flags.go:64] FLAG: --cluster-domain="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423083 5072 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 28 04:09:38 crc 
kubenswrapper[5072]: I0228 04:09:38.423094 5072 flags.go:64] FLAG: --config-dir="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423105 5072 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423115 5072 flags.go:64] FLAG: --container-log-max-files="5" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423129 5072 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423139 5072 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423150 5072 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423161 5072 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423174 5072 flags.go:64] FLAG: --contention-profiling="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423185 5072 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423197 5072 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423208 5072 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423219 5072 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423234 5072 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423245 5072 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423255 5072 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423264 5072 flags.go:64] FLAG: --enable-load-reader="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423275 5072 
flags.go:64] FLAG: --enable-server="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423285 5072 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423297 5072 flags.go:64] FLAG: --event-burst="100" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423307 5072 flags.go:64] FLAG: --event-qps="50" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423317 5072 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423327 5072 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423338 5072 flags.go:64] FLAG: --eviction-hard="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423350 5072 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423360 5072 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423370 5072 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423380 5072 flags.go:64] FLAG: --eviction-soft="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423390 5072 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423400 5072 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423411 5072 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423420 5072 flags.go:64] FLAG: --experimental-mounter-path="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423431 5072 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423442 5072 flags.go:64] FLAG: --fail-swap-on="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423452 5072 
flags.go:64] FLAG: --feature-gates="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423464 5072 flags.go:64] FLAG: --file-check-frequency="20s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423474 5072 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423485 5072 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423496 5072 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423506 5072 flags.go:64] FLAG: --healthz-port="10248" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423516 5072 flags.go:64] FLAG: --help="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423526 5072 flags.go:64] FLAG: --hostname-override="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423537 5072 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423548 5072 flags.go:64] FLAG: --http-check-frequency="20s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423558 5072 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423568 5072 flags.go:64] FLAG: --image-credential-provider-config="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423577 5072 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423587 5072 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423597 5072 flags.go:64] FLAG: --image-service-endpoint="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423608 5072 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423618 5072 flags.go:64] FLAG: --kube-api-burst="100" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423627 5072 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423638 5072 flags.go:64] FLAG: --kube-api-qps="50" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423679 5072 flags.go:64] FLAG: --kube-reserved="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423691 5072 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423701 5072 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423712 5072 flags.go:64] FLAG: --kubelet-cgroups="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423721 5072 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423731 5072 flags.go:64] FLAG: --lock-file="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423741 5072 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423751 5072 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423761 5072 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423776 5072 flags.go:64] FLAG: --log-json-split-stream="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423786 5072 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423796 5072 flags.go:64] FLAG: --log-text-split-stream="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423806 5072 flags.go:64] FLAG: --logging-format="text" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423815 5072 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423825 5072 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.423835 5072 flags.go:64] FLAG: --manifest-url="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423845 5072 flags.go:64] FLAG: --manifest-url-header="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423858 5072 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423868 5072 flags.go:64] FLAG: --max-open-files="1000000" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423880 5072 flags.go:64] FLAG: --max-pods="110" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423890 5072 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423904 5072 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423915 5072 flags.go:64] FLAG: --memory-manager-policy="None" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423925 5072 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423935 5072 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423945 5072 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423955 5072 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423981 5072 flags.go:64] FLAG: --node-status-max-images="50" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.423991 5072 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424000 5072 flags.go:64] FLAG: --oom-score-adj="-999" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424010 5072 flags.go:64] FLAG: --pod-cidr="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424020 5072 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424036 5072 flags.go:64] FLAG: --pod-manifest-path="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424046 5072 flags.go:64] FLAG: --pod-max-pids="-1" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424057 5072 flags.go:64] FLAG: --pods-per-core="0" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424066 5072 flags.go:64] FLAG: --port="10250" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424076 5072 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424086 5072 flags.go:64] FLAG: --provider-id="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424096 5072 flags.go:64] FLAG: --qos-reserved="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424106 5072 flags.go:64] FLAG: --read-only-port="10255" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424116 5072 flags.go:64] FLAG: --register-node="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424125 5072 flags.go:64] FLAG: --register-schedulable="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424136 5072 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424172 5072 flags.go:64] FLAG: --registry-burst="10" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424182 5072 flags.go:64] FLAG: --registry-qps="5" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424191 5072 flags.go:64] FLAG: --reserved-cpus="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424201 5072 flags.go:64] FLAG: --reserved-memory="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424213 5072 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.424223 5072 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424234 5072 flags.go:64] FLAG: --rotate-certificates="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424244 5072 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424253 5072 flags.go:64] FLAG: --runonce="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424263 5072 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424274 5072 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424284 5072 flags.go:64] FLAG: --seccomp-default="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424294 5072 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424304 5072 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424316 5072 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424327 5072 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424338 5072 flags.go:64] FLAG: --storage-driver-password="root" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424348 5072 flags.go:64] FLAG: --storage-driver-secure="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424358 5072 flags.go:64] FLAG: --storage-driver-table="stats" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424368 5072 flags.go:64] FLAG: --storage-driver-user="root" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424378 5072 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424388 5072 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 28 
04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424398 5072 flags.go:64] FLAG: --system-cgroups="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424408 5072 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424423 5072 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424434 5072 flags.go:64] FLAG: --tls-cert-file="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424446 5072 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424462 5072 flags.go:64] FLAG: --tls-min-version="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424484 5072 flags.go:64] FLAG: --tls-private-key-file="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424503 5072 flags.go:64] FLAG: --topology-manager-policy="none" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424521 5072 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424535 5072 flags.go:64] FLAG: --topology-manager-scope="container" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424548 5072 flags.go:64] FLAG: --v="2" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424565 5072 flags.go:64] FLAG: --version="false" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424581 5072 flags.go:64] FLAG: --vmodule="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424596 5072 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.424609 5072 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.424967 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425015 5072 feature_gate.go:330] unrecognized feature 
gate: GatewayAPI Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425021 5072 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425027 5072 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425035 5072 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425045 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425050 5072 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425055 5072 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425059 5072 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425064 5072 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425070 5072 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425074 5072 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425078 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425082 5072 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425086 5072 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425090 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 04:09:38 crc 
kubenswrapper[5072]: W0228 04:09:38.425094 5072 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425101 5072 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425126 5072 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425131 5072 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425135 5072 feature_gate.go:330] unrecognized feature gate: Example Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425139 5072 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425144 5072 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425148 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425151 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425155 5072 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425158 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425162 5072 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425165 5072 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425169 5072 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 
04:09:38.425173 5072 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425179 5072 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425182 5072 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425186 5072 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425189 5072 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425193 5072 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425196 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425201 5072 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425204 5072 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425208 5072 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425211 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425217 5072 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425221 5072 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425225 5072 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425229 5072 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425233 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425237 5072 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425240 5072 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425244 5072 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425248 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425252 5072 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425257 5072 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425261 5072 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425264 5072 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425269 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425272 5072 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425276 5072 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425280 5072 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425284 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425288 5072 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425291 5072 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425295 5072 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425298 5072 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425303 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425307 5072 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425310 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425314 5072 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425317 5072 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425322 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425326 5072 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.425329 5072 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.425352 5072 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.436923 5072 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.436983 5072 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437143 5072 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437169 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437180 5072 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437190 5072 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437199 5072 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437206 5072 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437212 5072 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437219 5072 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437225 5072 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437231 5072 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437238 5072 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437244 5072 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437252 5072 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437259 5072 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437267 5072 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437275 5072 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437283 5072 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437291 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437299 5072 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437305 5072 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437312 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437319 5072 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437327 5072 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437335 5072 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437341 5072 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437348 5072 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437354 5072 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437363 5072 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437369 5072 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437376 5072 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437382 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437388 5072 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437395 5072 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437401 5072 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437408 5072 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437415 5072 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437422 5072 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437429 5072 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437435 5072 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437442 5072 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437448 5072 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437456 5072 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437462 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437469 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437475 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437483 5072 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437490 5072 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437497 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437503 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437512 5072 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437520 5072 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437527 5072 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437536 5072 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437545 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437553 5072 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437561 5072 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437570 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437578 5072 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437584 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437591 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437599 5072 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437607 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437615 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437622 5072 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437629 5072 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437637 5072 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437666 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437672 5072 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437679 5072 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437685 5072 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437692 5072 feature_gate.go:330] unrecognized feature gate: Example
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.437704 5072 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437915 5072 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437926 5072 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437933 5072 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437940 5072 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437946 5072 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437955 5072 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437962 5072 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437968 5072 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437974 5072 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437981 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437988 5072 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.437997 5072 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438005 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438012 5072 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438018 5072 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438024 5072 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438031 5072 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438037 5072 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438043 5072 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438052 5072 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438061 5072 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438069 5072 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438076 5072 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438082 5072 feature_gate.go:330] unrecognized feature gate: Example
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438090 5072 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438096 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438102 5072 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438108 5072 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438114 5072 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438120 5072 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438126 5072 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438132 5072 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438140 5072 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438148 5072 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438156 5072 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438164 5072 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438171 5072 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438180 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438187 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438196 5072 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438202 5072 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438207 5072 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438213 5072 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438221 5072 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438228 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438235 5072 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438241 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438248 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438255 5072 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438261 5072 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438268 5072 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438275 5072 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438283 5072 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438290 5072 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438296 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438303 5072 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438309 5072 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438316 5072 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438322 5072 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438328 5072 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438335 5072 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438344 5072 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438350 5072 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438356 5072 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438362 5072 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438368 5072 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438373 5072 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438378 5072 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438383 5072 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438389 5072 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.438395 5072 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.438405 5072 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.438720 5072 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.443670 5072 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.447246 5072 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.447372 5072 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.449171 5072 server.go:997] "Starting client certificate rotation"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.449200 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.449399 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.475037 5072 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.478998 5072 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.479444 5072 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.500390 5072 log.go:25] "Validated CRI v1 runtime API"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.535315 5072 log.go:25] "Validated CRI v1 image API"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.537574 5072 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.543051 5072 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-28-04-05-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.543116 5072 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.566253 5072 manager.go:217] Machine: {Timestamp:2026-02-28 04:09:38.562470036 +0000 UTC m=+0.557200248 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:05edca7b-62f1-4864-9cd6-627477cf26a7 BootID:c99101c4-599a-4ac8-9800-c4679859c59e Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:aa:de:33 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:aa:de:33 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:02:99:74 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d9:32:59 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6f:59:01 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7b:22:30 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:57:3d:a6:7c:30 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:16:94:3a:a4:8a:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.566504 5072 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.566668 5072 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.568821 5072 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569003 5072 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569038 5072 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569318 5072 topology_manager.go:138] "Creating topology manager with none policy" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569332 5072 container_manager_linux.go:303] "Creating device plugin manager" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569799 5072 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.569828 5072 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.570538 5072 state_mem.go:36] "Initialized new in-memory state store" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.570625 5072 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.574189 5072 kubelet.go:418] "Attempting to sync node with API server" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.574212 5072 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.574248 5072 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.574263 5072 kubelet.go:324] "Adding apiserver pod source" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.574275 5072 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.577889 5072 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.579926 5072 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.581284 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.581384 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.581589 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.581678 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.582362 5072 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 28 04:09:38 crc 
kubenswrapper[5072]: I0228 04:09:38.584054 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584079 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584087 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584095 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584108 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584116 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584123 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584133 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584141 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584150 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584186 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.584195 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.586483 5072 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.586959 5072 server.go:1280] "Started kubelet" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.588487 5072 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.588494 5072 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.589674 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:38 crc systemd[1]: Started Kubernetes Kubelet. Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.589727 5072 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.591296 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.591353 5072 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.591996 5072 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.592012 5072 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.592446 5072 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.593972 5072 server.go:460] "Adding debug handlers to kubelet server" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.594634 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.594812 5072 factory.go:55] Registering systemd factory Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.594842 5072 factory.go:221] Registration 
of the systemd container factory successfully Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.601258 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="200ms" Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.601197 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.601360 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.601425 5072 factory.go:153] Registering CRI-O factory Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.601452 5072 factory.go:221] Registration of the crio container factory successfully Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.601567 5072 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.601604 5072 factory.go:103] Registering Raw factory Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.601634 5072 manager.go:1196] Started watching for new ooms in manager Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.602699 5072 manager.go:319] Starting recovery of all containers 
Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.602423 5072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18984d9a533a6136 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,LastTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610049 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610097 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610111 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610122 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610131 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610140 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610148 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610157 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610169 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610177 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610186 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610195 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610204 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610216 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610225 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610234 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610245 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610253 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610262 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610274 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610283 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610293 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610302 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610310 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610345 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610354 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610449 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610476 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610486 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610495 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610505 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610531 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610542 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610553 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" 
seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610562 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610572 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610581 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.610591 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614050 5072 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614082 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614099 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614108 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614119 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614130 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614141 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614153 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614162 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614172 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614181 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614190 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614198 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614207 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614248 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614292 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614303 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614339 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614353 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614365 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614383 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614393 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614403 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614412 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614421 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614430 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614440 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614453 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614467 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614479 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614488 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614498 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614507 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614515 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614526 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614534 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614546 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614559 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614573 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614586 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614595 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614604 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614614 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614622 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614633 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614660 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614669 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614678 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614688 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614698 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614708 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614718 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614728 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614737 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614748 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614757 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614766 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614775 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614783 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614793 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614803 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614811 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614820 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614828 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614838 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614846 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614856 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614870 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614880 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614891 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614901 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614909 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614917 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614926 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614936 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614945 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614955 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614966 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614975 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614985 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.614994 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615002 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615014 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615027 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615042 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615057 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615067 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615077 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615087 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615098 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615107 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615117 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615132 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615144 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615154 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615165 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615177 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615188 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615200 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615210 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615252 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615264 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615275 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615285 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615295 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615305 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615316 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615326 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615340 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615350 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615360 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615370 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615380 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615401 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615413 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615423 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615433 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615444 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615456 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615466 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615476 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615486 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615497 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615508 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615519 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615530 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615541 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615552 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615561 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615572 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615583 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615594 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615606 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615617 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615628 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615664 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615681 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615694 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615707 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615744 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615756 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615766 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615776 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config"
seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615786 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615796 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615806 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615817 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615826 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615836 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.615846 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615856 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615866 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615875 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615885 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615894 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615907 5072 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615916 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615926 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615938 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615947 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615957 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615967 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615978 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615987 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.615997 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616006 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616018 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616031 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616042 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616055 5072 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616065 5072 reconstruct.go:97] "Volume reconstruction finished" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.616074 5072 reconciler.go:26] "Reconciler: start to sync state" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.625884 5072 manager.go:324] Recovery completed Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.637218 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.639032 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.639073 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.639086 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.640503 5072 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.640537 5072 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.640572 5072 state_mem.go:36] "Initialized new in-memory state store" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.655300 5072 policy_none.go:49] "None policy: Start" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.655764 5072 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.656299 5072 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.656329 5072 state_mem.go:35] "Initializing new in-memory state store" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.657620 5072 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.657682 5072 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.657710 5072 kubelet.go:2335] "Starting kubelet main sync loop" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.657864 5072 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 28 04:09:38 crc kubenswrapper[5072]: W0228 04:09:38.659495 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.659576 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: 
connection refused" logger="UnhandledError" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.694953 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.712957 5072 manager.go:334] "Starting Device Plugin manager" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.713030 5072 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.713049 5072 server.go:79] "Starting device plugin registration server" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.713583 5072 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.714013 5072 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.714190 5072 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.714296 5072 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.714310 5072 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.723329 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.758025 5072 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 28 04:09:38 crc 
kubenswrapper[5072]: I0228 04:09:38.758151 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.759846 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.759883 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.759894 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760024 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760225 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760266 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760798 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760812 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760832 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760834 5072 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760845 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.760970 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761145 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761175 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761779 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761814 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761830 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.761923 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.762140 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.762194 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.762982 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763012 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763023 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763079 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763104 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763115 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763122 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763138 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763146 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763257 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: 
I0228 04:09:38.763342 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763377 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763883 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763917 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.763931 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764082 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764101 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764118 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764128 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764134 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764703 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764726 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.764736 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.802617 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="400ms" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.814268 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.815909 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.815943 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 
04:09:38.815953 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.815977 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:09:38 crc kubenswrapper[5072]: E0228 04:09:38.816448 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820050 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820090 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820112 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820129 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820147 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820180 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820210 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820235 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820282 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820311 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820338 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820383 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820421 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820462 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.820532 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922199 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922251 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922274 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922292 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922314 
5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922333 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922349 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922365 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922383 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922400 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922393 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922407 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922452 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922455 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922427 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922499 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922502 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922460 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922476 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922418 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922456 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922544 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922395 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922561 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922577 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922596 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922602 5072 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922605 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922577 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 04:09:38 crc kubenswrapper[5072]: I0228 04:09:38.922674 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.017593 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.018780 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.018821 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.018832 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 
04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.018853 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:09:39 crc kubenswrapper[5072]: E0228 04:09:39.019293 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.083069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.090300 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.113548 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: W0228 04:09:39.138502 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4497a7c0b05ffd7f5b4e8dd7ad708a9ede4f7289a230f36e9de2509045253d5f WatchSource:0}: Error finding container 4497a7c0b05ffd7f5b4e8dd7ad708a9ede4f7289a230f36e9de2509045253d5f: Status 404 returned error can't find the container with id 4497a7c0b05ffd7f5b4e8dd7ad708a9ede4f7289a230f36e9de2509045253d5f Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.139173 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.146497 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 04:09:39 crc kubenswrapper[5072]: W0228 04:09:39.173964 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1644f3a55d8634f8d06e71ad9d71f250f268bc1b43b398623032748efdee9c6a WatchSource:0}: Error finding container 1644f3a55d8634f8d06e71ad9d71f250f268bc1b43b398623032748efdee9c6a: Status 404 returned error can't find the container with id 1644f3a55d8634f8d06e71ad9d71f250f268bc1b43b398623032748efdee9c6a Feb 28 04:09:39 crc kubenswrapper[5072]: E0228 04:09:39.203470 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="800ms" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.419461 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.421100 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.421182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.421195 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.421223 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:09:39 crc kubenswrapper[5072]: E0228 04:09:39.421665 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: 
connection refused" node="crc" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.590780 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:39 crc kubenswrapper[5072]: W0228 04:09:39.653794 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:39 crc kubenswrapper[5072]: E0228 04:09:39.653903 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.662247 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e285869f66fd2747b0891130fafd5d0bda7a8a1ca8721810b2725f375f92275"} Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.663178 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c291532391d00ffe3d8b136a6cc2db02ee56d212584651b269cc2ca5ac473691"} Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.664143 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1644f3a55d8634f8d06e71ad9d71f250f268bc1b43b398623032748efdee9c6a"} Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.665436 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c5ae934bb8f7a64ddd863e64b7537d7eda46ece3089fba7b9bc592b19d68ae9"} Feb 28 04:09:39 crc kubenswrapper[5072]: I0228 04:09:39.666345 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4497a7c0b05ffd7f5b4e8dd7ad708a9ede4f7289a230f36e9de2509045253d5f"} Feb 28 04:09:39 crc kubenswrapper[5072]: W0228 04:09:39.822562 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:39 crc kubenswrapper[5072]: E0228 04:09:39.822670 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:40 crc kubenswrapper[5072]: W0228 04:09:39.879917 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:40 crc kubenswrapper[5072]: E0228 04:09:39.880001 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.222758 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc kubenswrapper[5072]: E0228 04:09:40.561211 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="1.6s" Feb 28 04:09:40 crc kubenswrapper[5072]: W0228 04:09:40.561282 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:40 crc kubenswrapper[5072]: E0228 04:09:40.561399 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.565402 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.565511 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.565540 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.566323 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:09:40 crc kubenswrapper[5072]: E0228 04:09:40.567397 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.590855 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.640117 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 04:09:40 crc kubenswrapper[5072]: E0228 04:09:40.641325 5072 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.671986 5072 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3" exitCode=0 Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.672085 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.672156 5072 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.673123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.673152 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.673165 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.674523 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.674566 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.677948 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4" exitCode=0 Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.677990 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.678356 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc 
kubenswrapper[5072]: I0228 04:09:40.679614 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.679682 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.679700 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.680359 5072 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6" exitCode=0 Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.680414 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.680504 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.681542 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.681589 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.681610 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.681962 5072 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56" exitCode=0 Feb 28 04:09:40 crc 
kubenswrapper[5072]: I0228 04:09:40.682007 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56"} Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.682019 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.683219 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.683281 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.683302 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.684669 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.685494 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.685539 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:40 crc kubenswrapper[5072]: I0228 04:09:40.685555 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.591142 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Feb 28 04:09:41 crc 
kubenswrapper[5072]: I0228 04:09:41.688357 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f"} Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.691322 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50"} Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.693315 5072 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e" exitCode=0 Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.693419 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.693412 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e"} Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.694216 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.694238 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.694248 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.695546 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9"}
Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.695585 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.696251 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.696276 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.696287 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:41 crc kubenswrapper[5072]: I0228 04:09:41.697720 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532"}
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.163040 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="3.2s"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.168095 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.169718 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.169786 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.169804 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.169848 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.170529 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc"
Feb 28 04:09:42 crc kubenswrapper[5072]: W0228 04:09:42.257533 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.257632 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Feb 28 04:09:42 crc kubenswrapper[5072]: W0228 04:09:42.417787 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.417899 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.591061 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.717160 5072 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d" exitCode=0
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.717263 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.717311 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.718678 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.718733 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.718748 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.721284 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.721269 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.721423 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.722055 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.722083 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.722092 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.725764 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.725937 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.727118 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.727150 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.727160 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728849 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee6dd2cda84095a6efa21fc5da187b69b24039aba2b772cff0c334bf312f0916"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728880 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728893 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728910 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728920 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998"}
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.728941 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729541 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729566 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729577 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729597 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729617 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:42 crc kubenswrapper[5072]: I0228 04:09:42.729628 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:42 crc kubenswrapper[5072]: W0228 04:09:42.793102 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.793205 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Feb 28 04:09:42 crc kubenswrapper[5072]: W0228 04:09:42.933524 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Feb 28 04:09:42 crc kubenswrapper[5072]: E0228 04:09:42.933622 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.182011 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.605742 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.744994 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a"}
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.745053 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4"}
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.745066 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d"}
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.745075 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d"}
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.745192 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.745230 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746333 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746517 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746727 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746770 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746805 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746822 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746779 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.746893 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.747229 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.747292 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:43 crc kubenswrapper[5072]: I0228 04:09:43.747323 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.751957 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791"}
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.752046 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.752127 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.752130 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753064 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753065 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753146 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753168 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753117 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753610 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753654 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.753664 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.961001 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.961289 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.963378 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.963439 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.963453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:44 crc kubenswrapper[5072]: I0228 04:09:44.983699 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.371247 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.373240 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.373302 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.373324 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.373367 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.755457 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.756587 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.756680 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:45 crc kubenswrapper[5072]: I0228 04:09:45.756707 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:46 crc kubenswrapper[5072]: I0228 04:09:46.915170 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:09:46 crc kubenswrapper[5072]: I0228 04:09:46.915546 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:46 crc kubenswrapper[5072]: I0228 04:09:46.917257 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:46 crc kubenswrapper[5072]: I0228 04:09:46.917301 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:46 crc kubenswrapper[5072]: I0228 04:09:46.917319 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.355349 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.355586 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.357173 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.357235 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.357249 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.961219 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:09:47 crc kubenswrapper[5072]: I0228 04:09:47.961349 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:09:48 crc kubenswrapper[5072]: E0228 04:09:48.724036 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:09:50 crc kubenswrapper[5072]: I0228 04:09:50.560855 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:50 crc kubenswrapper[5072]: I0228 04:09:50.561144 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:50 crc kubenswrapper[5072]: I0228 04:09:50.562947 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:50 crc kubenswrapper[5072]: I0228 04:09:50.563011 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:50 crc kubenswrapper[5072]: I0228 04:09:50.563026 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.406205 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.406511 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.408599 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.408698 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.408713 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.410723 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.591365 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.757466 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.757757 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.759523 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.759576 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.759595 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.773539 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.774940 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.774993 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.775013 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:51 crc kubenswrapper[5072]: I0228 04:09:51.780511 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:09:52 crc kubenswrapper[5072]: I0228 04:09:52.776198 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:52 crc kubenswrapper[5072]: I0228 04:09:52.777381 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:52 crc kubenswrapper[5072]: I0228 04:09:52.777432 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:52 crc kubenswrapper[5072]: I0228 04:09:52.777445 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.272414 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: W0228 04:09:53.276705 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.276846 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.279717 5072 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.279795 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 28 04:09:53 crc kubenswrapper[5072]: W0228 04:09:53.280664 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.280801 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 28 04:09:53 crc kubenswrapper[5072]: W0228 04:09:53.283921 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.284012 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.285004 5072 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.285100 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 28 04:09:53 crc kubenswrapper[5072]: W0228 04:09:53.285556 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.285673 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.287214 5072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18984d9a533a6136 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,LastTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.287938 5072 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.292789 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Feb 28 04:09:53 crc kubenswrapper[5072]: E0228 04:09:53.295729 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.593823 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:53Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.781363 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.783325 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee6dd2cda84095a6efa21fc5da187b69b24039aba2b772cff0c334bf312f0916" exitCode=255
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.783353 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ee6dd2cda84095a6efa21fc5da187b69b24039aba2b772cff0c334bf312f0916"}
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.783506 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.783526 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784442 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784451 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784454 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784476 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784510 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:09:53 crc kubenswrapper[5072]: I0228 04:09:53.784948 5072 scope.go:117] "RemoveContainer" containerID="ee6dd2cda84095a6efa21fc5da187b69b24039aba2b772cff0c334bf312f0916"
Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.594422 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:54Z is after 2026-02-23T05:33:13Z
Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.790205 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.793128 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b"}
Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.793307 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume
controller attach/detach" Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.794591 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.794693 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:54 crc kubenswrapper[5072]: I0228 04:09:54.794710 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.594006 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:55Z is after 2026-02-23T05:33:13Z Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.799945 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.800552 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.803027 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" exitCode=255 Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.803087 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b"} Feb 28 
04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.803162 5072 scope.go:117] "RemoveContainer" containerID="ee6dd2cda84095a6efa21fc5da187b69b24039aba2b772cff0c334bf312f0916" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.803312 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.804348 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.804408 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.804430 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.805193 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:09:55 crc kubenswrapper[5072]: E0228 04:09:55.805514 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:09:55 crc kubenswrapper[5072]: I0228 04:09:55.951417 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.593460 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-28T04:09:56Z is after 2026-02-23T05:33:13Z Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.809041 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.812463 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.813626 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.813667 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.813677 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.814211 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:09:56 crc kubenswrapper[5072]: E0228 04:09:56.814379 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:09:56 crc kubenswrapper[5072]: I0228 04:09:56.921458 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.593045 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:57Z is after 2026-02-23T05:33:13Z Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.814791 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.815606 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.815634 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.815665 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.816254 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:09:57 crc kubenswrapper[5072]: E0228 04:09:57.816440 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.820258 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.962177 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure 
output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 04:09:57 crc kubenswrapper[5072]: I0228 04:09:57.962262 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.596247 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:58Z is after 2026-02-23T05:33:13Z Feb 28 04:09:58 crc kubenswrapper[5072]: E0228 04:09:58.724214 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.817673 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.819244 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.819289 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.819303 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:58 crc kubenswrapper[5072]: I0228 04:09:58.819992 5072 scope.go:117] "RemoveContainer" 
containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:09:58 crc kubenswrapper[5072]: E0228 04:09:58.820272 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.597003 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:59Z is after 2026-02-23T05:33:13Z Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.695882 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.697779 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.697813 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.697828 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:09:59 crc kubenswrapper[5072]: I0228 04:09:59.697862 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:09:59 crc kubenswrapper[5072]: E0228 04:09:59.698927 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:59Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 28 04:09:59 crc kubenswrapper[5072]: E0228 04:09:59.700954 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:09:59Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 04:10:00 crc kubenswrapper[5072]: W0228 04:10:00.073497 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 28 04:10:00 crc kubenswrapper[5072]: E0228 04:10:00.073582 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.602207 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.640597 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.640945 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.642297 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.642342 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.642356 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:00 crc kubenswrapper[5072]: I0228 04:10:00.643028 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:10:00 crc kubenswrapper[5072]: E0228 04:10:00.643236 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:10:00 crc kubenswrapper[5072]: W0228 04:10:00.910279 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:00 crc kubenswrapper[5072]: E0228 04:10:00.910354 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.596153 5072 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.634561 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.652909 5072 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.789776 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.790009 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.791187 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.791270 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.791294 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.803218 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.826326 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.827323 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.827368 5072 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:01 crc kubenswrapper[5072]: I0228 04:10:01.827384 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:02 crc kubenswrapper[5072]: I0228 04:10:02.594959 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.292120 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a533a6136 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,LastTimestamp:2026-02-28 04:09:38.586927414 +0000 UTC m=+0.581657606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.296900 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.300633 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.304456 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC 
m=+0.633822201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.308569 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5af0e090 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.71632808 +0000 UTC m=+0.711058262,LastTimestamp:2026-02-28 04:09:38.71632808 +0000 UTC m=+0.711058262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.313116 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.759870455 +0000 UTC m=+0.754600647,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc 
kubenswrapper[5072]: E0228 04:10:03.315769 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.759890746 +0000 UTC m=+0.754620938,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.316946 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.759899076 +0000 UTC m=+0.754629268,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.319211 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.760816624 +0000 UTC m=+0.755546816,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.320621 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.760826454 +0000 UTC m=+0.755556646,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.323970 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.760830455 +0000 UTC m=+0.755560647,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.327862 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.760840665 +0000 UTC m=+0.755570857,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.331824 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.760849235 +0000 UTC m=+0.755579427,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.335592 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.760858095 +0000 UTC m=+0.755588287,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.339541 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.761802475 +0000 UTC m=+0.756532667,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.343711 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.761824536 +0000 UTC m=+0.756554728,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.347170 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC 
m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.761836226 +0000 UTC m=+0.756566408,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.350931 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.762999783 +0000 UTC m=+0.757729975,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.354365 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.763019184 +0000 UTC m=+0.757749376,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.359173 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.763029414 +0000 UTC m=+0.757759606,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.362463 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.763093066 +0000 UTC m=+0.757823258,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.366858 5072 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.763111217 +0000 UTC m=+0.757841409,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.370526 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56565929\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56565929 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639092009 +0000 UTC m=+0.633822201,LastTimestamp:2026-02-28 04:09:38.763121607 +0000 UTC m=+0.757851809,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.374200 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a5655e07e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a5655e07e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639061118 +0000 UTC m=+0.633791310,LastTimestamp:2026-02-28 04:09:38.763133418 +0000 UTC m=+0.757863600,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.378816 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984d9a56562ffc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984d9a56562ffc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:38.639081468 +0000 UTC m=+0.633811660,LastTimestamp:2026-02-28 04:09:38.763143158 +0000 UTC m=+0.757873350,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.383890 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9a7424deaf openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.139165871 +0000 UTC m=+1.133896053,LastTimestamp:2026-02-28 04:09:39.139165871 +0000 UTC m=+1.133896053,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.387862 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9a742500d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.139174611 +0000 UTC m=+1.133904803,LastTimestamp:2026-02-28 04:09:39.139174611 +0000 UTC m=+1.133904803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.391715 5072 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9a74920670 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.146319472 +0000 UTC m=+1.141049664,LastTimestamp:2026-02-28 04:09:39.146319472 +0000 UTC m=+1.141049664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.395341 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9a757a0f76 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.161526134 +0000 UTC 
m=+1.156256326,LastTimestamp:2026-02-28 04:09:39.161526134 +0000 UTC m=+1.156256326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.399728 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9a765bb3c4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.176313796 +0000 UTC m=+1.171043988,LastTimestamp:2026-02-28 04:09:39.176313796 +0000 UTC m=+1.171043988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.405023 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9a98cf9486 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.754333318 +0000 UTC m=+1.749063510,LastTimestamp:2026-02-28 04:09:39.754333318 +0000 UTC m=+1.749063510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.408893 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9a98cff8e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.754359008 +0000 UTC m=+1.749089200,LastTimestamp:2026-02-28 04:09:39.754359008 +0000 UTC m=+1.749089200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.412916 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18984d9a9961c250 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.763913296 +0000 UTC m=+1.758643498,LastTimestamp:2026-02-28 04:09:39.763913296 +0000 UTC m=+1.758643498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.417319 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9a998be495 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.766674581 +0000 UTC m=+1.761404773,LastTimestamp:2026-02-28 04:09:39.766674581 +0000 UTC m=+1.761404773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.421764 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9a998ef716 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.766875926 +0000 UTC m=+1.761606118,LastTimestamp:2026-02-28 04:09:39.766875926 +0000 UTC m=+1.761606118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.426745 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9a998fe07e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.766935678 +0000 UTC m=+1.761665870,LastTimestamp:2026-02-28 04:09:39.766935678 +0000 UTC m=+1.761665870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.431079 5072 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9a99b9a2fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.769672443 +0000 UTC m=+1.764402635,LastTimestamp:2026-02-28 04:09:39.769672443 +0000 UTC m=+1.764402635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.435289 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9a99f0ec21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.773295649 +0000 UTC m=+1.768025841,LastTimestamp:2026-02-28 04:09:39.773295649 +0000 UTC m=+1.768025841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.439704 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9a9a37bc56 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.77793647 +0000 UTC m=+1.772666662,LastTimestamp:2026-02-28 04:09:39.77793647 +0000 UTC m=+1.772666662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.444756 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9a9a71976e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.78172811 +0000 UTC m=+1.776458312,LastTimestamp:2026-02-28 04:09:39.78172811 +0000 UTC m=+1.776458312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.448705 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9a9a99e22b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.784368683 +0000 UTC m=+1.779098875,LastTimestamp:2026-02-28 04:09:39.784368683 +0000 UTC m=+1.779098875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.452355 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9acc88a302 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.622099202 +0000 UTC m=+2.616829394,LastTimestamp:2026-02-28 04:09:40.622099202 
+0000 UTC m=+2.616829394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.455677 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9acd218304 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.63211802 +0000 UTC m=+2.626848212,LastTimestamp:2026-02-28 04:09:40.63211802 +0000 UTC m=+2.626848212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.459200 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9acd49adcd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.634750413 +0000 UTC m=+2.629480605,LastTimestamp:2026-02-28 04:09:40.634750413 +0000 UTC m=+2.629480605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.463092 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9acfb20cde openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.675144926 +0000 UTC m=+2.669875118,LastTimestamp:2026-02-28 04:09:40.675144926 +0000 UTC m=+2.669875118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.467301 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9ad03f0133 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.684382515 +0000 UTC m=+2.679112707,LastTimestamp:2026-02-28 04:09:40.684382515 +0000 UTC m=+2.679112707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.470993 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9ad04d79fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.685330938 +0000 UTC m=+2.680061130,LastTimestamp:2026-02-28 04:09:40.685330938 +0000 UTC m=+2.680061130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc 
kubenswrapper[5072]: E0228 04:10:03.474476 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9ad04f5e2b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.685454891 +0000 UTC m=+2.680185083,LastTimestamp:2026-02-28 04:09:40.685454891 +0000 UTC m=+2.680185083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.478391 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9b039d4fd8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.546201048 +0000 UTC m=+3.540931240,LastTimestamp:2026-02-28 04:09:41.546201048 +0000 UTC m=+3.540931240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.481764 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b0487eca2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.56157661 +0000 UTC m=+3.556306802,LastTimestamp:2026-02-28 04:09:41.56157661 +0000 UTC m=+3.556306802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.485120 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9b0500d367 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.569500007 +0000 UTC m=+3.564230199,LastTimestamp:2026-02-28 04:09:41.569500007 +0000 UTC m=+3.564230199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.488666 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9b051acbd4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.571202004 +0000 UTC m=+3.565932196,LastTimestamp:2026-02-28 04:09:41.571202004 +0000 UTC m=+3.565932196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.492428 5072 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9b055b1865 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.575415909 +0000 UTC m=+3.570146101,LastTimestamp:2026-02-28 04:09:41.575415909 +0000 UTC m=+3.570146101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.496042 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b057978d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.577406673 +0000 UTC m=+3.572136865,LastTimestamp:2026-02-28 04:09:41.577406673 +0000 UTC m=+3.572136865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.499803 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b058bc463 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.578605667 +0000 UTC m=+3.573335859,LastTimestamp:2026-02-28 04:09:41.578605667 +0000 UTC m=+3.573335859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.503096 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b063042c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.589385922 +0000 UTC m=+3.584116114,LastTimestamp:2026-02-28 04:09:41.589385922 +0000 UTC m=+3.584116114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.506389 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984d9b075d6ea5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.609123493 +0000 UTC m=+3.603853675,LastTimestamp:2026-02-28 04:09:41.609123493 +0000 UTC m=+3.603853675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.510254 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b07d9b3fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.617267706 +0000 UTC m=+3.611997898,LastTimestamp:2026-02-28 04:09:41.617267706 +0000 UTC m=+3.611997898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.513497 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b07e9d488 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.618324616 +0000 UTC m=+3.613054808,LastTimestamp:2026-02-28 04:09:41.618324616 +0000 UTC m=+3.613054808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.516710 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b096de445 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.643756613 +0000 UTC m=+3.638486795,LastTimestamp:2026-02-28 04:09:41.643756613 +0000 UTC m=+3.638486795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.520387 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b0bb9c61b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.682284059 +0000 UTC m=+3.677014251,LastTimestamp:2026-02-28 04:09:41.682284059 +0000 UTC m=+3.677014251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.521491 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b0c8935a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.695878561 +0000 UTC m=+3.690608753,LastTimestamp:2026-02-28 04:09:41.695878561 +0000 UTC m=+3.690608753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.525164 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9b11b29afd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.782477565 +0000 UTC m=+3.777207757,LastTimestamp:2026-02-28 04:09:41.782477565 +0000 UTC m=+3.777207757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc 
kubenswrapper[5072]: E0228 04:10:03.528788 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b12a775d6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.798524374 +0000 UTC m=+3.793254566,LastTimestamp:2026-02-28 04:09:41.798524374 +0000 UTC m=+3.793254566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.532074 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b12c47f49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.800427337 +0000 UTC m=+3.795157529,LastTimestamp:2026-02-28 04:09:41.800427337 +0000 UTC m=+3.795157529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.535126 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9b135df9ee openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.810485742 +0000 UTC m=+3.805215924,LastTimestamp:2026-02-28 04:09:41.810485742 +0000 UTC m=+3.805215924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.538498 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b14043f79 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.821382521 +0000 UTC m=+3.816112703,LastTimestamp:2026-02-28 04:09:41.821382521 +0000 UTC m=+3.816112703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.542010 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b1419bd1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.82279094 +0000 UTC m=+3.817521132,LastTimestamp:2026-02-28 04:09:41.82279094 +0000 UTC m=+3.817521132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.545482 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b14500fbd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.826351037 +0000 UTC m=+3.821081229,LastTimestamp:2026-02-28 04:09:41.826351037 +0000 UTC m=+3.821081229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.549009 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b1454e6ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.826668266 +0000 UTC m=+3.821398468,LastTimestamp:2026-02-28 04:09:41.826668266 +0000 UTC m=+3.821398468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc 
kubenswrapper[5072]: E0228 04:10:03.553004 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b1a99cd07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.931846919 +0000 UTC m=+3.926577111,LastTimestamp:2026-02-28 04:09:41.931846919 +0000 UTC m=+3.926577111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.556628 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b1c1c4e84 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.957176964 +0000 UTC m=+3.951907156,LastTimestamp:2026-02-28 04:09:41.957176964 +0000 UTC m=+3.951907156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.560622 
5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b1e6b6bfd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:41.995916285 +0000 UTC m=+3.990646467,LastTimestamp:2026-02-28 04:09:41.995916285 +0000 UTC m=+3.990646467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.564137 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984d9b1f302400 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.008808448 +0000 UTC m=+4.003538640,LastTimestamp:2026-02-28 04:09:42.008808448 +0000 UTC m=+4.003538640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.567596 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b23aee72f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.084224815 +0000 UTC m=+4.078955017,LastTimestamp:2026-02-28 04:09:42.084224815 +0000 UTC m=+4.078955017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.572501 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b24aa49f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.100699637 +0000 
UTC m=+4.095429829,LastTimestamp:2026-02-28 04:09:42.100699637 +0000 UTC m=+4.095429829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.576192 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b24bf168f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.102062735 +0000 UTC m=+4.096792927,LastTimestamp:2026-02-28 04:09:42.102062735 +0000 UTC m=+4.096792927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.580184 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b31a16e0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.31822286 +0000 UTC m=+4.312953052,LastTimestamp:2026-02-28 04:09:42.31822286 +0000 UTC m=+4.312953052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.583328 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b326183da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.330811354 +0000 UTC m=+4.325541546,LastTimestamp:2026-02-28 04:09:42.330811354 +0000 UTC m=+4.325541546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.586959 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b32780e67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.332288615 +0000 UTC m=+4.327018807,LastTimestamp:2026-02-28 04:09:42.332288615 +0000 UTC m=+4.327018807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: I0228 04:10:03.591658 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.591676 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b3f0f880d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
04:09:42.543542285 +0000 UTC m=+4.538272477,LastTimestamp:2026-02-28 04:09:42.543542285 +0000 UTC m=+4.538272477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.595519 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b3fad51db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.553883099 +0000 UTC m=+4.548613291,LastTimestamp:2026-02-28 04:09:42.553883099 +0000 UTC m=+4.548613291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.600304 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b499ac43a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.720439354 +0000 UTC m=+4.715169546,LastTimestamp:2026-02-28 04:09:42.720439354 +0000 UTC m=+4.715169546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.604103 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b55af7dca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.92312417 +0000 UTC m=+4.917854352,LastTimestamp:2026-02-28 04:09:42.92312417 +0000 UTC m=+4.917854352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.607367 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b56595b01 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.934256385 +0000 UTC m=+4.928986577,LastTimestamp:2026-02-28 04:09:42.934256385 +0000 UTC m=+4.928986577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.612410 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b566c19ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.935484909 +0000 UTC m=+4.930215101,LastTimestamp:2026-02-28 04:09:42.935484909 +0000 UTC m=+4.930215101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.618888 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b6504f9dc openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.180384732 +0000 UTC m=+5.175114924,LastTimestamp:2026-02-28 04:09:43.180384732 +0000 UTC m=+5.175114924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.625684 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b65dc73d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.194506199 +0000 UTC m=+5.189236391,LastTimestamp:2026-02-28 04:09:43.194506199 +0000 UTC m=+5.189236391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.628871 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b65f0bedd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.195836125 +0000 UTC m=+5.190566317,LastTimestamp:2026-02-28 04:09:43.195836125 +0000 UTC m=+5.190566317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.633294 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b74298fa7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.434440615 +0000 UTC m=+5.429170807,LastTimestamp:2026-02-28 04:09:43.434440615 +0000 UTC m=+5.429170807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.637894 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b74f8ff74 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.448035188 +0000 UTC m=+5.442765380,LastTimestamp:2026-02-28 04:09:43.448035188 +0000 UTC m=+5.442765380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.642729 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b750a9380 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.4491872 +0000 UTC m=+5.443917402,LastTimestamp:2026-02-28 04:09:43.4491872 +0000 UTC m=+5.443917402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.646980 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18984d9b81cd1ba3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.663262627 +0000 UTC m=+5.657992869,LastTimestamp:2026-02-28 04:09:43.663262627 +0000 UTC m=+5.657992869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.650489 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b82cc974e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.680005966 +0000 UTC m=+5.674736168,LastTimestamp:2026-02-28 04:09:43.680005966 +0000 UTC m=+5.674736168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.654609 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b82e35cff openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.681498367 +0000 UTC m=+5.676228579,LastTimestamp:2026-02-28 04:09:43.681498367 +0000 UTC m=+5.676228579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.659446 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984d9b8efb949b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.884412059 +0000 UTC m=+5.879142261,LastTimestamp:2026-02-28 04:09:43.884412059 +0000 UTC m=+5.879142261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.663722 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18984d9b90030a8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:43.901678222 +0000 UTC m=+5.896408434,LastTimestamp:2026-02-28 04:09:43.901678222 +0000 UTC m=+5.896408434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.669136 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 04:10:03 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-controller-manager-crc.18984d9c81fc2843 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 28 04:10:03 crc kubenswrapper[5072]: body: Feb 28 04:10:03 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:47.961313347 +0000 UTC m=+9.956043539,LastTimestamp:2026-02-28 04:09:47.961313347 +0000 UTC m=+9.956043539,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Feb 28 04:10:03 crc kubenswrapper[5072]: > Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.672766 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9c81fd6395 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:47.961394069 +0000 UTC m=+9.956124271,LastTimestamp:2026-02-28 04:09:47.961394069 +0000 UTC m=+9.956124271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.677717 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 28 04:10:03 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-apiserver-crc.18984d9dbefd5c92 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 04:10:03 
crc kubenswrapper[5072]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 04:10:03 crc kubenswrapper[5072]: Feb 28 04:10:03 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:53.279769746 +0000 UTC m=+15.274499948,LastTimestamp:2026-02-28 04:09:53.279769746 +0000 UTC m=+15.274499948,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 04:10:03 crc kubenswrapper[5072]: > Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.681122 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9dbefe5f17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:53.279835927 +0000 UTC m=+15.274566139,LastTimestamp:2026-02-28 04:09:53.279835927 +0000 UTC m=+15.274566139,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.684599 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984d9dbefd5c92\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event=< Feb 28 04:10:03 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-apiserver-crc.18984d9dbefd5c92 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 04:10:03 crc kubenswrapper[5072]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 04:10:03 crc kubenswrapper[5072]: Feb 28 04:10:03 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:53.279769746 +0000 UTC m=+15.274499948,LastTimestamp:2026-02-28 04:09:53.285073001 +0000 UTC m=+15.279803203,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 04:10:03 crc kubenswrapper[5072]: > Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.688195 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984d9dbefe5f17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9dbefe5f17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:53.279835927 +0000 UTC m=+15.274566139,LastTimestamp:2026-02-28 04:09:53.285135103 +0000 UTC m=+15.279865315,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.691799 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984d9b32780e67\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b32780e67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.332288615 +0000 UTC m=+4.327018807,LastTimestamp:2026-02-28 04:09:53.78595144 +0000 UTC m=+15.780681642,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.695016 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984d9b3f0f880d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b3f0f880d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.543542285 +0000 UTC m=+4.538272477,LastTimestamp:2026-02-28 04:09:53.988313837 +0000 UTC m=+15.983044029,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.698459 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984d9b3fad51db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984d9b3fad51db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:42.553883099 +0000 UTC m=+4.548613291,LastTimestamp:2026-02-28 04:09:53.999877314 +0000 UTC m=+15.994607506,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.702335 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Feb 28 04:10:03 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed61630fd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 04:10:03 crc kubenswrapper[5072]: body: Feb 28 04:10:03 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:57.962240253 +0000 UTC m=+19.956970455,LastTimestamp:2026-02-28 04:09:57.962240253 +0000 UTC m=+19.956970455,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 04:10:03 crc kubenswrapper[5072]: > Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.705458 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed616f8c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:57.962291395 +0000 UTC m=+19.957021597,LastTimestamp:2026-02-28 04:09:57.962291395 +0000 UTC m=+19.957021597,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:03 crc kubenswrapper[5072]: W0228 04:10:03.956620 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 28 04:10:03 crc kubenswrapper[5072]: E0228 04:10:03.956742 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 04:10:04 crc kubenswrapper[5072]: I0228 04:10:04.594894 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:05 crc kubenswrapper[5072]: W0228 04:10:05.449327 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 28 04:10:05 crc kubenswrapper[5072]: E0228 04:10:05.449405 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 04:10:05 crc kubenswrapper[5072]: I0228 04:10:05.594830 5072 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.596550 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.701363 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.704486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.704534 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.704545 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:06 crc kubenswrapper[5072]: E0228 04:10:06.704490 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 04:10:06 crc kubenswrapper[5072]: I0228 04:10:06.704578 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:10:06 crc kubenswrapper[5072]: E0228 04:10:06.709802 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 
04:10:07.595204 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.962222 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.962413 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.962535 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.963039 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.965173 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.965237 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.965265 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.966233 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 28 04:10:07 crc kubenswrapper[5072]: I0228 04:10:07.966686 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb" gracePeriod=30 Feb 28 04:10:07 crc kubenswrapper[5072]: E0228 04:10:07.967839 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9ed61630fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 04:10:07 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed61630fd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 04:10:07 crc kubenswrapper[5072]: body: Feb 28 04:10:07 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
04:09:57.962240253 +0000 UTC m=+19.956970455,LastTimestamp:2026-02-28 04:10:07.962362238 +0000 UTC m=+29.957092460,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 04:10:07 crc kubenswrapper[5072]: > Feb 28 04:10:07 crc kubenswrapper[5072]: E0228 04:10:07.973116 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9ed616f8c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed616f8c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:57.962291395 +0000 UTC m=+19.957021597,LastTimestamp:2026-02-28 04:10:07.962476131 +0000 UTC m=+29.957206363,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:07 crc kubenswrapper[5072]: E0228 04:10:07.979572 5072 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984da12a64d582 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:10:07.966614914 +0000 UTC m=+29.961345196,LastTimestamp:2026-02-28 04:10:07.966614914 +0000 UTC m=+29.961345196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:08 crc kubenswrapper[5072]: E0228 04:10:08.090101 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9a99b9a2fb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9a99b9a2fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:39.769672443 +0000 UTC m=+1.764402635,LastTimestamp:2026-02-28 04:10:08.08320442 +0000 UTC m=+30.077934612,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:08 crc kubenswrapper[5072]: E0228 04:10:08.239499 5072 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9acc88a302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9acc88a302 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.622099202 +0000 UTC m=+2.616829394,LastTimestamp:2026-02-28 04:10:08.235029101 +0000 UTC m=+30.229759293,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:08 crc kubenswrapper[5072]: E0228 04:10:08.248729 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9acd218304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9acd218304 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:40.63211802 +0000 UTC 
m=+2.626848212,LastTimestamp:2026-02-28 04:10:08.244384068 +0000 UTC m=+30.239114260,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.596937 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:08 crc kubenswrapper[5072]: E0228 04:10:08.724335 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.852232 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.852475 5072 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb" exitCode=255 Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.852510 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb"} Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.852540 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448"} Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.852634 5072 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.853410 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.853434 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:08 crc kubenswrapper[5072]: I0228 04:10:08.853444 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:09 crc kubenswrapper[5072]: I0228 04:10:09.595849 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:10 crc kubenswrapper[5072]: I0228 04:10:10.595027 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.590608 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.590898 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.592426 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.592471 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.592488 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:11 crc kubenswrapper[5072]: I0228 04:10:11.592540 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:12 crc kubenswrapper[5072]: I0228 04:10:12.596543 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.595770 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:13 crc kubenswrapper[5072]: E0228 04:10:13.709621 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.710547 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.711771 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.711809 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.711844 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Feb 28 04:10:13 crc kubenswrapper[5072]: I0228 04:10:13.711869 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 04:10:13 crc kubenswrapper[5072]: E0228 04:10:13.716627 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.597372 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.961092 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.961367 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.962739 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.962780 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:14 crc kubenswrapper[5072]: I0228 04:10:14.962796 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:15 crc kubenswrapper[5072]: W0228 04:10:15.362198 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 28 04:10:15 crc kubenswrapper[5072]: E0228 04:10:15.362274 5072 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.596006 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.658488 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.659921 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.659992 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.660017 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.661063 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.873254 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.875514 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"} Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.875770 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.876565 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.876599 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:15 crc kubenswrapper[5072]: I0228 04:10:15.876611 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.598450 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.880631 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.881298 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.883624 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01" exitCode=255 Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.883695 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"} Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.883762 5072 scope.go:117] "RemoveContainer" containerID="f68a4c5e30201dfd85006a7d26b11b4b6924210802315fc48c69f51cb66c0b7b" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.884030 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.885121 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.886255 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.886533 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:16 crc kubenswrapper[5072]: I0228 04:10:16.887855 5072 scope.go:117] "RemoveContainer" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01" Feb 28 04:10:16 crc kubenswrapper[5072]: E0228 04:10:16.888715 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:10:17 crc kubenswrapper[5072]: I0228 04:10:17.598818 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope
Feb 28 04:10:17 crc kubenswrapper[5072]: I0228 04:10:17.889281 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 28 04:10:17 crc kubenswrapper[5072]: I0228 04:10:17.961345 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:10:17 crc kubenswrapper[5072]: I0228 04:10:17.961431 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:10:17 crc kubenswrapper[5072]: E0228 04:10:17.966315 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9ed61630fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 28 04:10:17 crc kubenswrapper[5072]: &Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed61630fd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 28 04:10:17 crc kubenswrapper[5072]: body:
Feb 28 04:10:17 crc kubenswrapper[5072]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:57.962240253 +0000 UTC m=+19.956970455,LastTimestamp:2026-02-28 04:10:17.961406772 +0000 UTC m=+39.956136964,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 28 04:10:17 crc kubenswrapper[5072]: >
Feb 28 04:10:17 crc kubenswrapper[5072]: E0228 04:10:17.972273 5072 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984d9ed616f8c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984d9ed616f8c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:09:57.962291395 +0000 UTC m=+19.957021597,LastTimestamp:2026-02-28 04:10:17.961500735 +0000 UTC m=+39.956230927,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 28 04:10:18 crc kubenswrapper[5072]: I0228 04:10:18.595277 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:18 crc kubenswrapper[5072]: E0228 04:10:18.724594 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:10:19 crc kubenswrapper[5072]: I0228 04:10:19.595866 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:20 crc kubenswrapper[5072]: W0228 04:10:20.302907 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 28 04:10:20 crc kubenswrapper[5072]: E0228 04:10:20.302986 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 28 04:10:20 crc kubenswrapper[5072]: W0228 04:10:20.516149 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:20 crc kubenswrapper[5072]: E0228 04:10:20.516213 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.597437 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.640720 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.641032 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.644388 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.644451 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.644468 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.645316 5072 scope.go:117] "RemoveContainer" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"
Feb 28 04:10:20 crc kubenswrapper[5072]: E0228 04:10:20.645616 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.717030 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:20 crc kubenswrapper[5072]: E0228 04:10:20.717758 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.718377 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.718408 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.718416 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:20 crc kubenswrapper[5072]: I0228 04:10:20.718467 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:10:20 crc kubenswrapper[5072]: E0228 04:10:20.724480 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 28 04:10:21 crc kubenswrapper[5072]: I0228 04:10:21.594757 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:22 crc kubenswrapper[5072]: W0228 04:10:22.592180 5072 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 28 04:10:22 crc kubenswrapper[5072]: E0228 04:10:22.592242 5072 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 28 04:10:22 crc kubenswrapper[5072]: I0228 04:10:22.592338 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:23 crc kubenswrapper[5072]: I0228 04:10:23.595217 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.597105 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.965152 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.965329 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.966366 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.966397 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.966408 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:24 crc kubenswrapper[5072]: I0228 04:10:24.969666 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.595760 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.918238 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.919176 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.919217 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.919230 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.951900 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.952150 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.953309 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.953353 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.953368 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:25 crc kubenswrapper[5072]: I0228 04:10:25.954146 5072 scope.go:117] "RemoveContainer" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"
Feb 28 04:10:25 crc kubenswrapper[5072]: E0228 04:10:25.954423 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 04:10:26 crc kubenswrapper[5072]: I0228 04:10:26.594960 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.594379 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:27 crc kubenswrapper[5072]: E0228 04:10:27.724620 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.724716 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.726368 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.726431 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.726446 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:27 crc kubenswrapper[5072]: I0228 04:10:27.726481 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:10:27 crc kubenswrapper[5072]: E0228 04:10:27.731198 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 28 04:10:28 crc kubenswrapper[5072]: I0228 04:10:28.593983 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:28 crc kubenswrapper[5072]: E0228 04:10:28.724787 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:10:29 crc kubenswrapper[5072]: I0228 04:10:29.597418 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:30 crc kubenswrapper[5072]: I0228 04:10:30.595310 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:31 crc kubenswrapper[5072]: I0228 04:10:31.594482 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:32 crc kubenswrapper[5072]: I0228 04:10:32.597268 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.186606 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.186771 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.187759 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.187820 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.187846 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:33 crc kubenswrapper[5072]: I0228 04:10:33.596004 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.598145 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.731472 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:34 crc kubenswrapper[5072]: E0228 04:10:34.731680 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.732616 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.732674 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.732687 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:34 crc kubenswrapper[5072]: I0228 04:10:34.732714 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:10:34 crc kubenswrapper[5072]: E0228 04:10:34.737907 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 28 04:10:35 crc kubenswrapper[5072]: I0228 04:10:35.600674 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:36 crc kubenswrapper[5072]: I0228 04:10:36.594724 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:37 crc kubenswrapper[5072]: I0228 04:10:37.595103 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.595850 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.658385 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.659570 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.659631 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.659660 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.660213 5072 scope.go:117] "RemoveContainer" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"
Feb 28 04:10:38 crc kubenswrapper[5072]: E0228 04:10:38.725691 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.951698 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.956144 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9"}
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.956284 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.957109 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.957133 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:38 crc kubenswrapper[5072]: I0228 04:10:38.957142 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.595027 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.960991 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.961503 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.963191 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" exitCode=255
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.963235 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9"}
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.963278 5072 scope.go:117] "RemoveContainer" containerID="ab47ff29ae19a7deb48b17ffd8e5336e8e228c8cf412ef4575bd99b48cb69b01"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.963420 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.964215 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.964242 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.964253 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:39 crc kubenswrapper[5072]: I0228 04:10:39.964708 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9"
Feb 28 04:10:39 crc kubenswrapper[5072]: E0228 04:10:39.964855 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.595398 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.640611 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.967763 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.970800 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.971458 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.971487 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.971495 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:40 crc kubenswrapper[5072]: I0228 04:10:40.971989 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9"
Feb 28 04:10:40 crc kubenswrapper[5072]: E0228 04:10:40.972145 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.594188 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:41 crc kubenswrapper[5072]: E0228 04:10:41.735830 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.738705 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.739800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.739835 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.739844 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:41 crc kubenswrapper[5072]: I0228 04:10:41.739867 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:10:41 crc kubenswrapper[5072]: E0228 04:10:41.744487 5072 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 28 04:10:42 crc kubenswrapper[5072]: I0228 04:10:42.594936 5072 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 28 04:10:42 crc kubenswrapper[5072]: I0228 04:10:42.972193 5072 csr.go:261] certificate signing request csr-slgk2 is approved, waiting to be issued
Feb 28 04:10:42 crc kubenswrapper[5072]: I0228 04:10:42.979926 5072 csr.go:257] certificate signing request csr-slgk2 is issued
Feb 28 04:10:43 crc kubenswrapper[5072]: I0228 04:10:43.011962 5072 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 28 04:10:43 crc kubenswrapper[5072]: I0228 04:10:43.448427 5072 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 28 04:10:43 crc kubenswrapper[5072]: I0228 04:10:43.980928 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-13 05:54:01.05308901 +0000 UTC
Feb 28 04:10:43 crc kubenswrapper[5072]: I0228 04:10:43.980968 5072 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7657h43m17.072124435s for next certificate rotation
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.951735 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.951894 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.952920 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.952957 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.953002 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:45 crc kubenswrapper[5072]: I0228 04:10:45.953686 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9"
Feb 28 04:10:45 crc kubenswrapper[5072]: E0228 04:10:45.953887 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.726294 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.745622 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.748884 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.749065 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.749176 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.749455 5072 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.756216 5072 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.756822 5072 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.756935 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.761401 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.761437 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.761449 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.761472 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.761488 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:48Z","lastTransitionTime":"2026-02-28T04:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.778281 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.788297 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.788341 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.788354 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.788376 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.788389 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:48Z","lastTransitionTime":"2026-02-28T04:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.804309 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.814311 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.814698 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.814867 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.815008 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.815129 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:48Z","lastTransitionTime":"2026-02-28T04:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.828056 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.835311 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.835368 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.835379 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.835399 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:10:48 crc kubenswrapper[5072]: I0228 04:10:48.835417 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:48Z","lastTransitionTime":"2026-02-28T04:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.850294 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.850453 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.850479 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:48 crc kubenswrapper[5072]: E0228 04:10:48.951417 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.052306 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.153270 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.253857 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.354884 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.455567 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.556539 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.657019 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.757341 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.858453 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:49 crc kubenswrapper[5072]: E0228 04:10:49.958679 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.059043 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.160217 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.261150 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.361808 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.462196 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.562720 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.663543 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.764618 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.864897 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:50 crc kubenswrapper[5072]: E0228 04:10:50.965517 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.066320 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.166732 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.267302 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.367476 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.468738 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.569694 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.670825 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.771436 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.871774 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:51 crc kubenswrapper[5072]: E0228 04:10:51.972328 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.072486 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.172628 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.273153 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.373477 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.473695 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.574770 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.675138 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.775218 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.876361 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:52 crc kubenswrapper[5072]: E0228 04:10:52.977239 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.077815 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.177964 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.278986 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.379817 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.480417 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.581368 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.681550 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.782370 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.882493 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:53 crc kubenswrapper[5072]: E0228 04:10:53.983333 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.084492 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.185595 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.286488 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.386754 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.487627 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.588464 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.689465 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.790312 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.891219 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:54 crc kubenswrapper[5072]: E0228 04:10:54.991366 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.092036 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.192177 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.293315 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.393499 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.493716 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.594720 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.695175 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.796254 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.897168 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:55 crc kubenswrapper[5072]: E0228 04:10:55.997576 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.098243 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.199362 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.300231 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.400347 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.500871 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.601685 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:56 crc kubenswrapper[5072]: E0228 04:10:56.702712 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.448547 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.549501 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.650517 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.750768 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.850917 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:57 crc kubenswrapper[5072]: E0228 04:10:57.951334 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.051999 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.152727 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.253825 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.354797 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.454916 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: I0228 04:10:58.466137 5072 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.556014 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.657034 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.727486 5072 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.758107 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.859226 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:58 crc kubenswrapper[5072]: E0228 04:10:58.960140 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.061250 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.162549 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.168696 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.172387 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.172439 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.172451 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.172467 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.172480 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:59Z","lastTransitionTime":"2026-02-28T04:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.181714 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.184664 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.184708 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.184723 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.184741 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.184750 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:59Z","lastTransitionTime":"2026-02-28T04:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.193092 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.195882 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.195956 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.195970 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.195988 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.195998 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:59Z","lastTransitionTime":"2026-02-28T04:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.203841 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.206486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.206517 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.206526 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.206542 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.206551 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:10:59Z","lastTransitionTime":"2026-02-28T04:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.214169 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.214317 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.263570 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.363913 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.464447 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.565583 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.659077 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.660090 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.660147 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:10:59 crc kubenswrapper[5072]: I0228 04:10:59.660164 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.666370 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.766682 
5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.867507 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:10:59 crc kubenswrapper[5072]: E0228 04:10:59.995803 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.096494 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.196857 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.297455 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.398133 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: I0228 04:11:00.398828 5072 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.498628 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.599049 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: I0228 04:11:00.658442 5072 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 04:11:00 crc kubenswrapper[5072]: I0228 04:11:00.659416 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:00 crc 
kubenswrapper[5072]: I0228 04:11:00.659526 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:00 crc kubenswrapper[5072]: I0228 04:11:00.659610 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:00 crc kubenswrapper[5072]: I0228 04:11:00.660329 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.660596 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.700143 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.801167 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:00 crc kubenswrapper[5072]: E0228 04:11:00.902033 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.003073 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.103581 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.204764 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 
04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.305038 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.406107 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.506964 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.607115 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.707685 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.808780 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:01 crc kubenswrapper[5072]: E0228 04:11:01.909585 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.009693 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.110705 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.211395 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.312074 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.412500 5072 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.513222 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.613502 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.714045 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.814738 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:02 crc kubenswrapper[5072]: E0228 04:11:02.915093 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.016038 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: I0228 04:11:03.102953 5072 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.116546 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.217013 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.318071 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.418725 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.519573 5072 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.619827 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.720129 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.820747 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:03 crc kubenswrapper[5072]: E0228 04:11:03.921065 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.021957 5072 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.068473 5072 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.124718 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.124748 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.124758 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.124772 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.124781 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.227015 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.227045 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.227054 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.227068 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.227079 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.329622 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.329678 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.329688 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.329704 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.329712 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.432779 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.432841 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.432862 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.432878 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.432911 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.535467 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.535502 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.535510 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.535523 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.535532 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.638569 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.638603 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.638612 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.638627 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.638636 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.670266 5072 apiserver.go:52] "Watching apiserver" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.687274 5072 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.687572 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.688154 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688332 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.688370 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688431 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688407 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.688582 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.688632 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.692071 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.693158 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.693971 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.694176 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.694474 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.695211 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.695531 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.695689 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.695790 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.698196 5072 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 
04:11:04.712200 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726305 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726352 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726380 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726404 5072 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726448 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726472 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726494 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726516 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726539 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726563 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726583 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726603 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726623 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726667 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726692 5072 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726776 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726805 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726835 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726864 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726888 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" 
(UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726924 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726949 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.726976 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727001 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727027 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727051 5072 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727083 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727107 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727128 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727149 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727171 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727195 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727217 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727239 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727262 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727283 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727304 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727326 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727369 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727396 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727418 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727446 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.727469 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727491 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727521 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727553 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727603 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727628 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") 
pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727673 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727695 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727717 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727739 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727761 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727782 5072 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727804 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727827 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727857 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727878 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727900 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727923 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727945 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727966 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.727988 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728011 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728032 5072 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728054 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728076 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728098 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728125 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728147 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728170 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728193 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728215 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728237 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728260 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728280 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728304 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728326 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728349 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728372 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728394 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728414 
5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728435 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728458 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728481 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728504 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728526 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728549 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728571 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728595 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728617 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728643 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " 
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728705 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728745 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728774 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728865 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728889 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728912 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728935 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728957 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.728978 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729040 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729066 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.729091 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729114 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729136 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729159 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729181 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729206 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729229 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729251 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729274 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729297 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729319 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 04:11:04 
crc kubenswrapper[5072]: I0228 04:11:04.729342 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729365 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729388 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729412 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729436 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729461 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729483 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729507 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729529 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729553 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729576 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 
04:11:04.729598 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729620 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729664 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729691 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729715 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729739 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729761 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729783 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729806 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729830 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729854 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 28 
04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729882 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729903 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729939 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.729984 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730021 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730054 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730079 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730102 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730125 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730149 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730173 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.730197 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730219 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730243 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730267 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730293 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730320 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730351 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730383 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730418 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730442 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730465 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.730488 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730510 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730533 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730555 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730582 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730605 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730634 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730681 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730704 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730729 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730753 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.730795 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730820 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730844 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730868 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730891 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730914 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730939 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730963 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.730986 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731009 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731032 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 04:11:04 crc kubenswrapper[5072]: 
I0228 04:11:04.731055 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731079 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731106 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731132 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731163 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731197 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731227 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731252 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731275 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731299 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731685 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.731985 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732130 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732371 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732468 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732565 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" 
(UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732659 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732749 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732846 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732928 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733005 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733074 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733148 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733222 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733293 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733369 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733491 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733624 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733802 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733899 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733972 5072 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.734041 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.734174 5072 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.734255 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.732860 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.733273 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.734407 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736971 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.737079 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:05.236971631 +0000 UTC m=+87.231701833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.737296 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.737354 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.737451 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738082 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738227 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738464 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738634 5072 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.748211 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.754462 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.754600 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738730 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.737127 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.738979 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.735154 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.735519 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.735567 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.735965 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736199 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736286 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736427 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736500 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736520 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736566 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736730 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.736854 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739326 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739400 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739441 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739520 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739735 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739798 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739789 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739636 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.739944 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755062 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:05.255043002 +0000 UTC m=+87.249773264 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.740024 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.739986 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.740503 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.740716 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.740769 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741023 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741103 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741220 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741339 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.755141 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741536 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741630 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741778 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.741326 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.742048 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.742128 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.742539 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.742910 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.743212 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.743399 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.743973 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.744017 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755337 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.744245 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.744669 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745006 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745287 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745365 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745213 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.734928 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745785 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.745803 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.746109 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.746208 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.746243 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.746696 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.746726 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.747131 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.747406 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.748834 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.750340 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.750828 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.751029 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.751353 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.751933 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.752503 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.752765 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755506 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755517 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755530 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755567 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:05.255557098 +0000 UTC m=+87.250287410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755584 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755593 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.755619 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:05.255611729 +0000 UTC m=+87.250342021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.755841 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.756008 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.756183 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.756349 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.756508 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.756766 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.757063 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.757116 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.757375 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.757532 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.758042 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.758934 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.759037 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.759636 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760055 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760210 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760476 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760899 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760946 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.760991 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761174 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761302 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761573 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761658 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761824 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.761948 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.762045 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.762285 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.763586 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.763837 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.764360 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.764730 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.765020 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.765375 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.765780 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.765842 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.765874 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.766310 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.766440 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.766354 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.766832 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.767378 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.767577 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.767824 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.767870 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.768335 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.768330 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: E0228 04:11:04.768778 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:11:05.268759643 +0000 UTC m=+87.263489835 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769018 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769141 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769145 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769401 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769468 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769482 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769487 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769569 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770762 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770919 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770961 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770966 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769944 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770277 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770311 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770357 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770494 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771142 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770533 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.770540 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771242 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.769876 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771435 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771569 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771815 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.771902 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772027 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772047 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772179 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772243 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772304 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772332 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.772391 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.773180 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.773229 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.773483 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.773558 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.773605 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.774024 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.774029 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.774074 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.774343 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.774903 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.775069 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.775086 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.775609 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.775656 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.776150 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.776550 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.776788 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.777044 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.777299 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.778321 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.778734 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.780937 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.785346 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.785406 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.785511 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.785579 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.788826 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.789894 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790087 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790389 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790471 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790704 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790903 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790961 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.790959 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.791243 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.794045 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.797493 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.797521 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.797784 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.797817 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.797881 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.798387 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.804416 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.805951 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.806458 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.810045 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.820851 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.827107 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.831273 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835577 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835687 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835818 5072 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835842 5072 reconciler_common.go:293] "Volume detached 
for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835855 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835864 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835875 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835883 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835892 5072 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835900 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835911 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835922 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835935 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835946 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835958 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835966 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835974 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835982 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node 
\"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.835990 5072 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836001 5072 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836009 5072 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836018 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836027 5072 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836036 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836045 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.836054 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836064 5072 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836075 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836085 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836095 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836106 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836116 5072 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836127 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836138 5072 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836148 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836155 5072 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836163 5072 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836171 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836179 5072 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836187 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 
28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836195 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836203 5072 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836211 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836219 5072 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836227 5072 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836235 5072 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836245 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836254 5072 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836262 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836270 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836278 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836288 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836298 5072 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836307 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836315 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836322 5072 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836330 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836338 5072 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836345 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836355 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836364 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836372 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836380 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836390 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836400 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836412 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836420 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836428 5072 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836436 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836444 5072 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836451 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836459 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836466 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836474 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836483 5072 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836490 5072 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836498 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" 
(UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836506 5072 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836514 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836523 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836530 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836538 5072 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836546 5072 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836554 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836564 5072 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836575 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836588 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836600 5072 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836609 5072 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836617 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836626 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" 
DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836635 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836669 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836680 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836692 5072 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836703 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836714 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836726 5072 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836737 
5072 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836749 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836761 5072 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836773 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836781 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836790 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836799 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836807 5072 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836817 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836827 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836840 5072 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836854 5072 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836867 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836875 5072 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836884 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836892 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836900 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836908 5072 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836919 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836930 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836942 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836955 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node 
\"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836965 5072 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836973 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836981 5072 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836989 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.836997 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837006 5072 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837017 5072 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837028 5072 
reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837038 5072 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837048 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837059 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837070 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837079 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837087 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837097 5072 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837107 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837117 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837128 5072 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837138 5072 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837148 5072 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837160 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837170 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" 
Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837179 5072 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837190 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837201 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837213 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837223 5072 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837233 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837244 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837253 5072 reconciler_common.go:293] 
"Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837263 5072 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837272 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837282 5072 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837292 5072 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837301 5072 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837310 5072 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837320 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837332 5072 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837344 5072 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837355 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837365 5072 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837375 5072 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837393 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837403 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837414 5072 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837423 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837434 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837444 5072 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837454 5072 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837464 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837476 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837486 5072 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837496 5072 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837507 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837520 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837532 5072 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837545 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837558 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837568 5072 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837580 5072 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837592 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837602 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837613 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837623 5072 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837634 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837666 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837677 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837685 5072 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837694 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837705 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837717 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837727 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.837737 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc 
kubenswrapper[5072]: I0228 04:11:04.837747 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.838564 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.839047 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.875407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.875449 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.875459 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.875474 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.875483 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.977175 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.977213 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.977225 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.977244 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:04 crc kubenswrapper[5072]: I0228 04:11:04.977256 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:04Z","lastTransitionTime":"2026-02-28T04:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.007423 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.017468 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.021491 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 04:11:05 crc kubenswrapper[5072]: W0228 04:11:05.026739 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-3718623047e9fdd491517ede4001c324903ea8fe2fd2a8fd13ed0532230ef89d WatchSource:0}: Error finding container 3718623047e9fdd491517ede4001c324903ea8fe2fd2a8fd13ed0532230ef89d: Status 404 returned error can't find the container with id 3718623047e9fdd491517ede4001c324903ea8fe2fd2a8fd13ed0532230ef89d Feb 28 04:11:05 crc kubenswrapper[5072]: W0228 04:11:05.038666 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a4d962de20c343f165db4ab649292dfa37cb74d1ffe049d7a77656ba93b92ee1 WatchSource:0}: Error finding container a4d962de20c343f165db4ab649292dfa37cb74d1ffe049d7a77656ba93b92ee1: Status 404 returned error can't find the container with id a4d962de20c343f165db4ab649292dfa37cb74d1ffe049d7a77656ba93b92ee1 Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.084677 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.084742 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.084752 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.084771 5072 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.084785 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.186619 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.186672 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.186686 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.186705 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.186717 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.241167 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.241251 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.241306 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:06.241292788 +0000 UTC m=+88.236022980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.288387 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.288422 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.288435 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.288451 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.288462 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.342053 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.342166 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342209 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:11:06.342188122 +0000 UTC m=+88.336918304 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.342234 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.342267 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342384 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342385 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342398 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342407 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342409 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342419 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342451 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:06.34244357 +0000 UTC m=+88.337173762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342483 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:06.342472281 +0000 UTC m=+88.337202473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342557 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: E0228 04:11:05.342751 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:06.342714987 +0000 UTC m=+88.337445379 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.390174 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.390224 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.390236 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.390252 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.390262 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.470908 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a4d962de20c343f165db4ab649292dfa37cb74d1ffe049d7a77656ba93b92ee1"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.472156 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.472207 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3718623047e9fdd491517ede4001c324903ea8fe2fd2a8fd13ed0532230ef89d"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.473668 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.473707 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.473739 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3b77aca20aa599d5e6058fa1eee8838586d60965cde78f27e841c6b28da5be85"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.483068 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.492803 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.492843 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.492854 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.492870 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.492881 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.494195 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.504833 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.513258 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.523069 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.531393 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.540982 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.553302 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.563562 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.576201 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.584931 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.593027 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.594366 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.594397 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.594407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.594421 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.594431 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.696469 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.696520 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.696529 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.696546 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.696556 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.799343 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.799377 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.799386 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.799400 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.799409 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.901207 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.901259 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.901268 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.901284 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:05 crc kubenswrapper[5072]: I0228 04:11:05.901293 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:05Z","lastTransitionTime":"2026-02-28T04:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.003352 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.003395 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.003406 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.003424 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.003435 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.105365 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.105406 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.105416 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.105431 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.105442 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.207397 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.207436 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.207453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.207471 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.207480 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.251954 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.252048 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.252102 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:08.252087625 +0000 UTC m=+90.246817817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.309599 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.309664 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.309676 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.309690 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.309700 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.352630 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.352821 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:11:08.352790944 +0000 UTC m=+90.347521136 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.352891 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.352944 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.352986 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353036 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353055 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353103 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353162 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:08.353148445 +0000 UTC m=+90.347878637 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353177 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353191 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353219 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353232 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353254 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:08.353234297 +0000 UTC m=+90.347964489 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.353275 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:08.353264578 +0000 UTC m=+90.347994840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.412277 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.412319 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.412328 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.412344 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.412352 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.514494 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.514543 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.514555 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.514575 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.514587 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.616746 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.616787 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.616798 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.616817 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.616829 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.658913 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.658931 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.659004 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.659113 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.659238 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:06 crc kubenswrapper[5072]: E0228 04:11:06.659390 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.663423 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.663927 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.664696 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.665435 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.666695 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.667267 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.668362 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.668958 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.669924 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.670436 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.670930 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.671939 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.672409 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.673348 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.673880 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.674798 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.675391 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.675781 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.676789 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.677321 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.677762 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.678924 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.680930 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.685482 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.686465 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.688069 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.688917 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.689380 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.690370 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.690858 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.691785 5072 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.691892 5072 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.693788 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.694764 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.695248 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.696826 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.697452 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.698441 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.699055 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.700108 5072 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.700563 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.701503 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.702476 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.703090 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.703626 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.704821 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.705813 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.706593 5072 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.707103 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.708056 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.708611 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.710071 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.711203 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.711853 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.720383 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.720441 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.720452 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.720471 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.720483 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.822463 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.822507 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.822534 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.822558 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.822569 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.924999 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.925039 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.925050 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.925064 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:06 crc kubenswrapper[5072]: I0228 04:11:06.925075 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:06Z","lastTransitionTime":"2026-02-28T04:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.027656 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.027710 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.027720 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.027736 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.027746 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.129581 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.129620 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.129631 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.129668 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.129679 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.231948 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.231983 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.231991 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.232019 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.232027 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.333703 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.333742 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.333752 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.333769 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.333781 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.436083 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.436123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.436132 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.436148 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.436159 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.486634 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b"} Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.499334 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.512927 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.522976 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.533538 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.537909 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.537949 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.537959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.537971 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.537980 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.550900 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.562348 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:07Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.640983 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.641032 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.641043 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.641062 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.641074 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.742748 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.742791 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.742800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.742813 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.742823 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.845472 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.845511 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.845524 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.845542 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.845556 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.947997 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.948059 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.948068 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.948082 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:07 crc kubenswrapper[5072]: I0228 04:11:07.948092 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:07Z","lastTransitionTime":"2026-02-28T04:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.050114 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.050167 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.050184 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.050205 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.050220 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.152574 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.152615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.152623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.152638 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.152659 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.254553 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.254603 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.254623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.254674 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.254699 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.268988 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.269084 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.269145 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:12.269129824 +0000 UTC m=+94.263860026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.356783 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.356835 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.356849 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.356867 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.356878 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.370217 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.370289 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.370324 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.370351 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370420 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:11:12.370361338 +0000 UTC m=+94.365091530 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370459 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370495 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370499 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370508 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:12.370495872 +0000 UTC m=+94.365226064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370508 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370515 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370524 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370527 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370555 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:12.370548134 +0000 UTC m=+94.365278326 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.370566 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:12.370561834 +0000 UTC m=+94.365292026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.459031 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.459070 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.459078 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.459094 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.459103 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.564570 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.564608 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.564616 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.564629 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.564638 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.658837 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.658956 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.659034 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.658957 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.658837 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:08 crc kubenswrapper[5072]: E0228 04:11:08.659197 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.666182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.666205 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.666214 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.666228 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.666238 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.671606 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.685713 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.701151 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.715119 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.730824 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.746773 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.768075 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.768119 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.768131 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.768147 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.768157 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.870333 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.870370 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.870380 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.870394 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.870406 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.972788 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.972830 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.972842 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.972859 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:08 crc kubenswrapper[5072]: I0228 04:11:08.972872 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:08Z","lastTransitionTime":"2026-02-28T04:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.075091 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.075146 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.075157 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.075174 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.075188 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.178137 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.178227 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.178252 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.178282 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.178307 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.280486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.280539 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.280552 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.280570 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.280582 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.382986 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.383021 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.383030 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.383056 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.383066 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.485307 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.485588 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.485615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.485719 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.485747 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.539387 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.539518 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.539540 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.539566 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.539584 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.558692 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.569262 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.569678 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.569877 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.570162 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.570553 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.598320 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.604224 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.604255 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.604266 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.604282 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.604296 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.618462 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.623768 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.623814 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.623823 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.623838 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.623848 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.636136 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.640235 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.640271 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.640282 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.640337 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.640348 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.654806 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:09 crc kubenswrapper[5072]: E0228 04:11:09.654944 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.656398 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.656436 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.656448 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.656469 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.656483 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.758508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.758547 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.758556 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.758569 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.758578 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.861605 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.861682 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.861695 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.861714 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.861727 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.964424 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.964477 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.964492 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.964517 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:09 crc kubenswrapper[5072]: I0228 04:11:09.964534 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:09Z","lastTransitionTime":"2026-02-28T04:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.067269 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.067311 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.067325 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.067347 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.067363 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.169864 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.169923 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.169937 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.169954 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.169965 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.272419 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.272708 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.272792 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.272891 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.272972 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.375430 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.375486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.375497 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.375515 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.375529 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.477906 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.477942 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.477952 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.477968 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.477978 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.580513 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.580779 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.580862 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.580943 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.581053 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.658629 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.658694 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.658736 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:10 crc kubenswrapper[5072]: E0228 04:11:10.658789 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:11:10 crc kubenswrapper[5072]: E0228 04:11:10.658876 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:11:10 crc kubenswrapper[5072]: E0228 04:11:10.658990 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.684018 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.684073 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.684085 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.684100 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.684112 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.786157 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.786691 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.786758 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.786824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.786893 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.889571 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.889607 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.889616 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.889659 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.889672 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.992096 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.992161 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.992170 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.992184 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:10 crc kubenswrapper[5072]: I0228 04:11:10.992194 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:10Z","lastTransitionTime":"2026-02-28T04:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.094650 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.094690 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.094719 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.094735 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.094745 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.196477 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.196729 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.196795 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.196859 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.196915 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.299190 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.299243 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.299254 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.299270 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.299287 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.401560 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.401630 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.401690 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.401721 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.401739 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.503502 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.503542 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.503550 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.503563 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.503572 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.605820 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.605875 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.605886 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.605905 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.605914 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.708123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.708154 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.708163 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.708175 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.708184 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.810741 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.810791 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.810808 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.810830 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.810843 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.913542 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.913580 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.913595 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.913615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:11 crc kubenswrapper[5072]: I0228 04:11:11.913629 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:11Z","lastTransitionTime":"2026-02-28T04:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.016182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.016211 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.016220 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.016232 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.016241 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.118819 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.118864 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.118874 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.118889 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.118899 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.221213 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.221272 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.221289 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.221313 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.221340 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.303444 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.303549 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.303632 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:20.303607414 +0000 UTC m=+102.298337616 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.323756 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.323794 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.323806 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.323824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.323836 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.404416 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404561 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:11:20.404532269 +0000 UTC m=+102.399262491 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.404605 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.404701 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.404757 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404884 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404898 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404921 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404938 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404967 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:20.404944442 +0000 UTC m=+102.399674654 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.404993 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:20.404979263 +0000 UTC m=+102.399709475 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.405607 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.405697 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.405724 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.405798 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:20.405780037 +0000 UTC m=+102.400510239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.426258 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.426320 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.426336 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.426381 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.426400 5072 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.528601 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.528714 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.528729 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.528752 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.528768 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.631193 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.631225 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.631233 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.631245 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.631254 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.658702 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.658725 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.658817 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.659353 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.659470 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.659522 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.675587 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" Feb 28 04:11:12 crc kubenswrapper[5072]: E0228 04:11:12.675774 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.677123 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.733663 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.733711 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.733743 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.733800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.733816 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.836491 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.836981 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.837090 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.837169 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.837244 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.939627 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.939686 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.939696 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.939714 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:12 crc kubenswrapper[5072]: I0228 04:11:12.939723 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:12Z","lastTransitionTime":"2026-02-28T04:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.042253 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.042313 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.042329 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.042353 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.042376 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.145137 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.145358 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.145436 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.145526 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.145614 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.247598 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.247659 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.247675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.247692 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.247704 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.350036 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.350081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.350093 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.350108 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.350121 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.451827 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.451891 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.451913 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.451940 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.451961 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.500766 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" Feb 28 04:11:13 crc kubenswrapper[5072]: E0228 04:11:13.501161 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.554170 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.554227 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.554245 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.554269 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.554286 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.656800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.657007 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.657100 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.657168 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.657224 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.758948 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.759221 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.759315 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.759389 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.759450 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.861961 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.862014 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.862026 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.862055 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.862079 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.963802 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.964042 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.964111 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.964187 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:13 crc kubenswrapper[5072]: I0228 04:11:13.964257 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:13Z","lastTransitionTime":"2026-02-28T04:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.066399 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.066680 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.066759 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.066843 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.066906 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.168521 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.168789 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.168877 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.168963 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.169035 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.270891 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.271151 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.271277 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.271372 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.271456 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.373863 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.373903 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.373915 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.373933 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.373944 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.475868 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.476273 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.476531 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.476709 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.476931 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.579200 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.579448 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.579531 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.579624 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.579929 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.658329 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.658333 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:14 crc kubenswrapper[5072]: E0228 04:11:14.658685 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:11:14 crc kubenswrapper[5072]: E0228 04:11:14.658471 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.658356 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:14 crc kubenswrapper[5072]: E0228 04:11:14.658837 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.682003 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.682052 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.682064 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.682083 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.682096 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.784470 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.784518 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.784530 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.784550 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.784563 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.886393 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.886430 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.886438 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.886453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.886461 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.991456 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.991493 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.991505 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.991699 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:14 crc kubenswrapper[5072]: I0228 04:11:14.991712 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:14Z","lastTransitionTime":"2026-02-28T04:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.093477 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.093507 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.093515 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.093527 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.093537 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.195675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.195706 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.195715 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.195728 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.195737 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.298328 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.298366 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.298376 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.298389 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.298399 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.400361 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.400604 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.400767 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.400866 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.400925 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.504631 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.504708 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.504727 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.504751 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.504771 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.608413 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.608811 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.608947 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.609079 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.609196 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.712615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.712926 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.713136 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.713319 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.713506 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.816596 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.816675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.816685 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.816703 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.816713 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.919877 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.920984 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.921224 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.921439 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:15 crc kubenswrapper[5072]: I0228 04:11:15.921735 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:15Z","lastTransitionTime":"2026-02-28T04:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.024573 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.024965 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.025110 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.025331 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.025509 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.129031 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.129399 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.129612 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.129800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.129939 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.232734 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.232789 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.232802 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.232820 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.232834 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.335211 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.335467 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.335585 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.335724 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.335843 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.438432 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.438507 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.438527 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.438551 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.438569 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.541344 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.541629 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.541747 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.541838 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.541926 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.643770 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.643802 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.643811 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.643824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.643833 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.659001 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.659017 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:11:16 crc kubenswrapper[5072]: E0228 04:11:16.659148 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.659177 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:11:16 crc kubenswrapper[5072]: E0228 04:11:16.659269 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:11:16 crc kubenswrapper[5072]: E0228 04:11:16.659340 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.746236 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.746453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.746517 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.746616 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.746727 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.848661 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.848697 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.848706 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.848723 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.848732 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.951207 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.951313 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.951331 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.951786 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:16 crc kubenswrapper[5072]: I0228 04:11:16.951835 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:16Z","lastTransitionTime":"2026-02-28T04:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.054415 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.054461 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.054473 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.054493 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.054504 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.156762 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.156818 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.156834 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.156885 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.156901 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.259008 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.259068 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.259089 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.259113 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.259129 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.361541 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.361583 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.361592 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.361606 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.361619 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.463593 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.463667 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.463681 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.463699 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.463713 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.565557 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.565604 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.565619 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.565657 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.565672 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.667905 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.667948 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.667963 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.667983 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.668000 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.771001 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.771041 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.771050 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.771065 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.771076 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.873435 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.873501 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.873522 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.873551 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.873573 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.976041 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.976119 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.976137 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.976164 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:17 crc kubenswrapper[5072]: I0228 04:11:17.976188 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:17Z","lastTransitionTime":"2026-02-28T04:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.078500 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.078626 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.078676 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.078712 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.078739 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.180956 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.181022 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.181033 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.181045 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.181054 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.283306 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.283368 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.283386 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.283407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.283418 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.386853 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.386939 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.386962 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.386991 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.387007 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.492545 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.492603 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.492662 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.492686 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.492700 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.595774 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.595803 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.595815 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.595829 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.595840 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.658686 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.658808 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:18 crc kubenswrapper[5072]: E0228 04:11:18.658967 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.659011 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:18 crc kubenswrapper[5072]: E0228 04:11:18.659171 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:18 crc kubenswrapper[5072]: E0228 04:11:18.659024 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.670880 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.686138 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.698743 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.700060 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.700099 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.700109 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.700123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.700132 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.710284 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.724268 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.736588 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.748281 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.801583 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.801662 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.801673 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.801688 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.801697 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.904331 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.904371 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.904380 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.904395 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:18 crc kubenswrapper[5072]: I0228 04:11:18.904406 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:18Z","lastTransitionTime":"2026-02-28T04:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.006377 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.006438 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.006456 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.006480 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.006497 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.109068 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.109134 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.109156 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.109185 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.109208 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.211664 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.211695 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.211704 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.211716 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.211725 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.314723 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.314797 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.314811 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.314832 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.314848 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.413552 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vxs2k"] Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.414046 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.415943 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.416453 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.416939 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.417408 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.417462 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.417486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.417515 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.417540 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.424301 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.438147 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.449789 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.460698 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.473190 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.487971 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.498998 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.512769 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.520193 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.520262 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.520276 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.520293 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.520306 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.568843 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-hosts-file\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.569244 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2gvq\" (UniqueName: \"kubernetes.io/projected/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-kube-api-access-w2gvq\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.623007 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.623049 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.623058 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.623074 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.623085 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.669895 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-hosts-file\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.669990 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2gvq\" (UniqueName: \"kubernetes.io/projected/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-kube-api-access-w2gvq\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.670029 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-hosts-file\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.690118 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2gvq\" (UniqueName: \"kubernetes.io/projected/2d35c1a3-39ec-4e00-9d3f-5d1934701d44-kube-api-access-w2gvq\") pod \"node-resolver-vxs2k\" (UID: \"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\") " pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.726113 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.726178 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: 
I0228 04:11:19.726193 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.726212 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.726225 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.738441 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vxs2k" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.787071 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5lrpf"] Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.787720 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n6jpz"] Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.788194 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.788588 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8pz98"] Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.788828 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.789292 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.790761 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.792123 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.792660 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.793325 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.793546 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794167 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794331 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794379 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794486 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794573 5072 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.794668 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.796943 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.810959 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.823085 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.828041 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.828095 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.828106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.828123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.828134 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.838579 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855248 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855260 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855276 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855298 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.855702 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.868030 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.868486 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871506 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-cni-binary-copy\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871744 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a035bbab-1d8f-4120-aaf7-88984d936939-proxy-tls\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871828 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871858 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-k8s-cni-cncf-io\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc 
kubenswrapper[5072]: I0228 04:11:19.871880 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-multus\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871902 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-kubelet\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871927 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.871986 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a035bbab-1d8f-4120-aaf7-88984d936939-rootfs\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872019 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-system-cni-dir\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " 
pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872099 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-bin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872147 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-conf-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872170 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-etc-kubernetes\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872192 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a035bbab-1d8f-4120-aaf7-88984d936939-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872212 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-cnibin\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: 
\"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872228 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-os-release\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872243 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwhsk\" (UniqueName: \"kubernetes.io/projected/44befe72-7499-4d23-a09b-3d715817c3cb-kube-api-access-gwhsk\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872259 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-os-release\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872293 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-system-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872310 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-multus-certs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872334 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-multus-daemon-config\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872397 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-hostroot\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872416 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlst\" (UniqueName: \"kubernetes.io/projected/a035bbab-1d8f-4120-aaf7-88984d936939-kube-api-access-wxlst\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872443 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-cnibin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872468 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-netns\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872504 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872525 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872545 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-socket-dir-parent\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872569 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tcs\" (UniqueName: \"kubernetes.io/projected/ae699423-376d-4342-bf44-7d70f68fadd1-kube-api-access-m2tcs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872702 5072 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872720 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872732 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872746 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.872765 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.885980 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.894678 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.898950 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.898979 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.898990 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.899005 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.899017 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.899255 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.911379 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.915084 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f
4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.916961 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.916996 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.917007 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.917359 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.917387 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.930749 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.934397 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.939987 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.940151 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.940239 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.940333 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.940460 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.943385 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.952302 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: E0228 04:11:19.952451 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.954590 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.954615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.954623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.954652 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.954663 5072 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:19Z","lastTransitionTime":"2026-02-28T04:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.955330 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.966927 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.972941 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-hostroot\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.972985 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-multus-daemon-config\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973019 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlst\" (UniqueName: 
\"kubernetes.io/projected/a035bbab-1d8f-4120-aaf7-88984d936939-kube-api-access-wxlst\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973038 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-cnibin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973056 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-netns\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973074 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973094 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973115 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-socket-dir-parent\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973140 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tcs\" (UniqueName: \"kubernetes.io/projected/ae699423-376d-4342-bf44-7d70f68fadd1-kube-api-access-m2tcs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973159 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a035bbab-1d8f-4120-aaf7-88984d936939-proxy-tls\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973177 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-cni-binary-copy\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973198 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973215 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-k8s-cni-cncf-io\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973231 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-multus\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973249 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-kubelet\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973268 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973310 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a035bbab-1d8f-4120-aaf7-88984d936939-rootfs\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973327 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-system-cni-dir\") pod 
\"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973344 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-bin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973361 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-conf-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973376 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-etc-kubernetes\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973400 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a035bbab-1d8f-4120-aaf7-88984d936939-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973419 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-cnibin\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: 
\"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973437 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-os-release\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973453 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwhsk\" (UniqueName: \"kubernetes.io/projected/44befe72-7499-4d23-a09b-3d715817c3cb-kube-api-access-gwhsk\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973474 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-os-release\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973488 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-system-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973506 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-multus-certs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " 
pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973610 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-multus\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.973768 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-hostroot\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974146 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-multus-certs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974198 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-kubelet\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974414 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-cnibin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974446 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-netns\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974516 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a035bbab-1d8f-4120-aaf7-88984d936939-rootfs\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974704 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-system-cni-dir\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974831 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-var-lib-cni-bin\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974880 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-conf-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.974958 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-etc-kubernetes\") pod \"multus-8pz98\" (UID: 
\"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975249 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975260 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-os-release\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975297 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-host-run-k8s-cni-cncf-io\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975341 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-system-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975367 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-cnibin\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc 
kubenswrapper[5072]: I0228 04:11:19.975445 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-os-release\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.975950 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44befe72-7499-4d23-a09b-3d715817c3cb-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.976359 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-socket-dir-parent\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.976539 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-cni-binary-copy\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.977078 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a035bbab-1d8f-4120-aaf7-88984d936939-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.980872 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae699423-376d-4342-bf44-7d70f68fadd1-multus-cni-dir\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.981437 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a035bbab-1d8f-4120-aaf7-88984d936939-proxy-tls\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.981473 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae699423-376d-4342-bf44-7d70f68fadd1-multus-daemon-config\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.982051 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.990492 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44befe72-7499-4d23-a09b-3d715817c3cb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.991501 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwhsk\" (UniqueName: \"kubernetes.io/projected/44befe72-7499-4d23-a09b-3d715817c3cb-kube-api-access-gwhsk\") pod \"multus-additional-cni-plugins-n6jpz\" (UID: \"44befe72-7499-4d23-a09b-3d715817c3cb\") " pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.995118 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tcs\" (UniqueName: \"kubernetes.io/projected/ae699423-376d-4342-bf44-7d70f68fadd1-kube-api-access-m2tcs\") pod \"multus-8pz98\" (UID: \"ae699423-376d-4342-bf44-7d70f68fadd1\") " pod="openshift-multus/multus-8pz98" Feb 28 04:11:19 crc kubenswrapper[5072]: I0228 04:11:19.995706 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.000821 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlst\" (UniqueName: \"kubernetes.io/projected/a035bbab-1d8f-4120-aaf7-88984d936939-kube-api-access-wxlst\") pod \"machine-config-daemon-5lrpf\" (UID: \"a035bbab-1d8f-4120-aaf7-88984d936939\") " pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.011891 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"r
eady\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.023662 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.035438 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.047625 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.056738 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.057376 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.057430 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.057441 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.057479 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.057490 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.067880 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.104558 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.112798 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.118696 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8pz98" Feb 28 04:11:20 crc kubenswrapper[5072]: W0228 04:11:20.131678 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44befe72_7499_4d23_a09b_3d715817c3cb.slice/crio-ad6b9e03103731cdfc75c906a3a0494360e77aa02f19737d9beaec1bf7b8377e WatchSource:0}: Error finding container ad6b9e03103731cdfc75c906a3a0494360e77aa02f19737d9beaec1bf7b8377e: Status 404 returned error can't find the container with id ad6b9e03103731cdfc75c906a3a0494360e77aa02f19737d9beaec1bf7b8377e Feb 28 04:11:20 crc kubenswrapper[5072]: W0228 04:11:20.132418 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae699423_376d_4342_bf44_7d70f68fadd1.slice/crio-6612b5a3060b5f57ebaa43c1611c6bba8000003caba75904f04b742e4d10f27e WatchSource:0}: Error finding container 6612b5a3060b5f57ebaa43c1611c6bba8000003caba75904f04b742e4d10f27e: Status 404 returned error can't find the container with id 6612b5a3060b5f57ebaa43c1611c6bba8000003caba75904f04b742e4d10f27e Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.154598 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kfpqp"] Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.155721 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.157963 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.158154 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.158298 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.158871 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.159061 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.159114 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.159290 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.160113 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.160145 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.160158 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.160176 5072 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.160188 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.170352 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.192660 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.204371 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.215207 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.227146 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.270810 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276500 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276546 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276563 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276577 5072 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276593 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276606 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276629 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276673 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276692 5072 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276710 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276730 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276747 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276765 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc 
kubenswrapper[5072]: I0228 04:11:20.276798 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276829 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276844 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276859 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276878 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpck\" (UniqueName: \"kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 
28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276897 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.276918 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.284976 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.285013 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.285025 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.285039 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.285052 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.298161 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.311919 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.323038 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.331934 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.342985 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.354155 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.377909 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.377948 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.377963 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.377977 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.377992 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378016 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378031 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378075 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378089 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378103 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378120 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378147 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378163 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378180 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378197 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378212 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378229 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378251 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpck\" (UniqueName: \"kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378266 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378280 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378294 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378343 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378375 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378394 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378413 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378816 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378953 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config\") pod 
\"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.378991 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.379025 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.379059 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:36.379047989 +0000 UTC m=+118.373778181 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379253 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379304 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379258 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379340 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379380 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379398 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379400 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379384 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379457 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379728 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfpqp\" 
(UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.379828 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.384461 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.386632 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.386667 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.386675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.386688 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.386699 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.397010 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpck\" (UniqueName: \"kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck\") pod \"ovnkube-node-kfpqp\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.479000 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.479092 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.479136 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479185 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 04:11:36.4791625 +0000 UTC m=+118.473892702 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.479216 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479260 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479276 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479285 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479299 5072 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479314 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:36.479306814 +0000 UTC m=+118.474037006 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479332 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:36.479322865 +0000 UTC m=+118.474053047 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479383 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479397 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479406 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.479431 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:11:36.479423468 +0000 UTC m=+118.474153670 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.488303 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.488337 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.488345 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.488357 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.488366 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.515474 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerStarted","Data":"b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.515518 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerStarted","Data":"6612b5a3060b5f57ebaa43c1611c6bba8000003caba75904f04b742e4d10f27e"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.517019 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerStarted","Data":"3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.517055 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerStarted","Data":"ad6b9e03103731cdfc75c906a3a0494360e77aa02f19737d9beaec1bf7b8377e"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.518556 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.518744 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d"} Feb 28 04:11:20 
crc kubenswrapper[5072]: I0228 04:11:20.518764 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"5a185a69393b81e30d738ce9aee78bb80dfebe6f980290ae1ff642a47615846d"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.519844 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxs2k" event={"ID":"2d35c1a3-39ec-4e00-9d3f-5d1934701d44","Type":"ContainerStarted","Data":"a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.519899 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxs2k" event={"ID":"2d35c1a3-39ec-4e00-9d3f-5d1934701d44","Type":"ContainerStarted","Data":"8e5e9bfb75610f7ba8f4ba6dccb4691e44997eb60651807bb04a70eed192222f"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.531912 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.541969 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.559153 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.571058 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.582515 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.588658 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.590370 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.590458 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.590833 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.590874 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.590893 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.595136 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.607863 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.623453 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.635462 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.645007 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: W0228 04:11:20.646916 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043491df_2577_47f6_9a5b_03fecada16ce.slice/crio-859207fea6a0d5434ba1a1de90bfddda710cea2d855120a8c045ab8b2aa9e1f9 WatchSource:0}: Error finding container 859207fea6a0d5434ba1a1de90bfddda710cea2d855120a8c045ab8b2aa9e1f9: Status 404 returned error can't find the container with id 859207fea6a0d5434ba1a1de90bfddda710cea2d855120a8c045ab8b2aa9e1f9 Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.655328 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.658812 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.658932 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.658812 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.658995 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.659046 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:20 crc kubenswrapper[5072]: E0228 04:11:20.659138 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.671461 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.686927 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.693906 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.693932 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.693940 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.693953 5072 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.693962 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.701273 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.712063 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.725142 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.741595 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.759996 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.772423 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.790330 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.797031 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.797066 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.797075 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.797090 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.797099 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.806172 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.818733 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.833438 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.847003 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.899878 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 
04:11:20.899927 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.899941 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.899959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:20 crc kubenswrapper[5072]: I0228 04:11:20.899972 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:20Z","lastTransitionTime":"2026-02-28T04:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.002068 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.002115 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.002126 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.002143 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.002154 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.104360 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.104397 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.104407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.104421 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.104431 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.206358 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.206577 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.206585 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.206598 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.206607 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.308794 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.308848 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.308861 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.308877 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.308889 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.411431 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.411489 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.411506 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.411530 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.411547 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.514058 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.514123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.514162 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.514194 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.514216 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.532286 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.532382 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515" exitCode=0 Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.532508 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"859207fea6a0d5434ba1a1de90bfddda710cea2d855120a8c045ab8b2aa9e1f9"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.535556 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1" exitCode=0 Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.535621 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.554900 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.592753 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.611976 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.616974 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.617006 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.617017 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc 
kubenswrapper[5072]: I0228 04:11:21.617031 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.617044 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.624812 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.637176 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.649014 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.662481 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.675460 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.687558 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.700859 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.710262 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720413 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720480 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720490 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720504 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720515 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.720927 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z 
is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.739925 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.755414 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.768726 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.783727 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.793938 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.806831 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.817765 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.821932 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.821961 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.821968 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.821982 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc 
kubenswrapper[5072]: I0228 04:11:21.821991 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.828966 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.841180 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.853060 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.870726 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.897594 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.924280 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.924321 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.924333 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.924350 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:21 crc kubenswrapper[5072]: I0228 04:11:21.924362 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:21Z","lastTransitionTime":"2026-02-28T04:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.025837 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.025869 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.025878 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.025893 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.025902 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.128080 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.128115 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.128124 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.128140 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.128150 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.231206 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.231246 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.231257 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.231278 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.231291 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.333807 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.333849 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.333863 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.333883 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.333899 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.437108 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.437191 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.437213 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.437243 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.437267 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.539156 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.539246 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.539292 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.539324 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.539343 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.540446 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075" exitCode=0 Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.540501 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545706 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545749 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545761 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545773 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545784 5072 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.545794 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.570410 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.591681 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.614028 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.627185 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.639889 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.642272 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.642305 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.642315 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.642332 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.642342 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.656750 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:
11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.658121 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.658188 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:22 crc kubenswrapper[5072]: E0228 04:11:22.658222 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.658258 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:22 crc kubenswrapper[5072]: E0228 04:11:22.658381 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:22 crc kubenswrapper[5072]: E0228 04:11:22.658474 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.673709 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.689872 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.700927 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.710479 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.720244 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.738728 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.747558 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.747598 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.747606 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.747622 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.747632 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.849851 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.850158 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.850168 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.850182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.850191 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.952345 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.952381 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.952389 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.952402 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:22 crc kubenswrapper[5072]: I0228 04:11:22.952411 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:22Z","lastTransitionTime":"2026-02-28T04:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.054860 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.054889 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.054898 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.054912 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.054921 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.157078 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.157110 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.157118 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.157132 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.157142 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.260287 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.260325 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.260337 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.260354 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.260368 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.362781 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.362824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.362832 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.362847 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.362856 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.466088 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.466134 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.466171 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.466194 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.466214 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.551910 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284" exitCode=0 Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.551996 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575114 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575205 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575235 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575281 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575310 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.575097 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.596307 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.629608 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.646444 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.664542 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.685551 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.687446 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.687508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.687523 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.687543 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.687556 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.705429 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.724335 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.737769 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.753475 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.774155 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.789183 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:23Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.791187 5072 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.791229 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.791246 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.791270 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.791286 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.893005 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.893032 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.893041 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.893054 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.893063 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.995729 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.995773 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.995784 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.995797 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:23 crc kubenswrapper[5072]: I0228 04:11:23.995808 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:23Z","lastTransitionTime":"2026-02-28T04:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.099411 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.099469 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.099482 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.099508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.099526 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.202264 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.202324 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.202407 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.202437 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.202458 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.305452 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.305499 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.305511 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.305529 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.305540 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.408579 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.408661 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.408679 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.408703 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.408716 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.512179 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.512222 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.512232 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.512247 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.512257 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.558472 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19" exitCode=0 Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.558540 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.563610 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.577535 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.598781 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.614578 5072 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.614623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.614670 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.614692 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.614705 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.623922 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.636957 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.654422 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.658720 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.658771 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.658818 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:24 crc kubenswrapper[5072]: E0228 04:11:24.658868 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:24 crc kubenswrapper[5072]: E0228 04:11:24.658964 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:24 crc kubenswrapper[5072]: E0228 04:11:24.659098 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.668259 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.682734 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.702537 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718213 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718263 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718299 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718316 5072 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718339 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.718363 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.736767 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.751493 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.766924 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:24Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.824799 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.824849 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.824861 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.824886 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.824926 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.929453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.930296 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.930339 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.930383 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:24 crc kubenswrapper[5072]: I0228 04:11:24.930414 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:24Z","lastTransitionTime":"2026-02-28T04:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.033538 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.033583 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.033596 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.033614 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.033628 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.137114 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.137182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.137201 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.137228 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.137247 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.240182 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.240234 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.240245 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.240264 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.240274 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.342401 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.342613 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.342689 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.342781 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.342843 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.445856 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.445902 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.445915 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.445932 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.445944 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.548620 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.548679 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.548689 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.548704 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.548714 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.605053 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d" exitCode=0 Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.605103 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.631190 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.640806 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90
f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.651915 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.651950 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.651960 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.651974 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.651984 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.652246 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.664254 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.674166 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.686270 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.698651 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.709755 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.720417 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.735381 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.748791 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.753870 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.753970 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.754037 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.754111 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.754188 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.761041 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:25Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.857966 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.858034 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.858050 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.858067 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:25 crc kubenswrapper[5072]: I0228 04:11:25.858100 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:25.960656 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:25.960697 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:25.960706 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:25.960721 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:25.960730 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:25Z","lastTransitionTime":"2026-02-28T04:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.062702 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.062732 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.062741 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.062755 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.062765 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.165127 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.165196 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.165208 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.165225 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.165235 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.175890 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sbbcr"] Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.176343 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.177981 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.178413 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.178501 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.178574 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.188703 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.204940 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.213750 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.223475 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.235005 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.244556 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.258670 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.267122 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.267154 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.267163 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.267180 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.267190 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.269687 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z 
is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.280457 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.292837 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.302493 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.311139 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.320970 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.339432 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-serviceca\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.339467 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-host\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.339490 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vlnz\" (UniqueName: \"kubernetes.io/projected/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-kube-api-access-2vlnz\") pod 
\"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.369390 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.369433 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.369442 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.369457 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.369467 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.441000 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-serviceca\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.441292 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-host\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.441394 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vlnz\" (UniqueName: \"kubernetes.io/projected/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-kube-api-access-2vlnz\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.441386 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-host\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.442132 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-serviceca\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.458861 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2vlnz\" (UniqueName: \"kubernetes.io/projected/4bcdefe7-e804-46ac-ad0a-5e593614fd6b-kube-api-access-2vlnz\") pod \"node-ca-sbbcr\" (UID: \"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\") " pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.472375 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.472486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.472569 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.472663 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.472727 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.488430 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sbbcr" Feb 28 04:11:26 crc kubenswrapper[5072]: W0228 04:11:26.500508 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bcdefe7_e804_46ac_ad0a_5e593614fd6b.slice/crio-c62ed5819d7179cf782ff5ccfe9c8e48d03f43fcfc445fe367758498629e1fbe WatchSource:0}: Error finding container c62ed5819d7179cf782ff5ccfe9c8e48d03f43fcfc445fe367758498629e1fbe: Status 404 returned error can't find the container with id c62ed5819d7179cf782ff5ccfe9c8e48d03f43fcfc445fe367758498629e1fbe Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.575062 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.575101 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.575113 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.575127 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.575136 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.611558 5072 generic.go:334] "Generic (PLEG): container finished" podID="44befe72-7499-4d23-a09b-3d715817c3cb" containerID="26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6" exitCode=0 Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.611624 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerDied","Data":"26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.612567 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sbbcr" event={"ID":"4bcdefe7-e804-46ac-ad0a-5e593614fd6b","Type":"ContainerStarted","Data":"c62ed5819d7179cf782ff5ccfe9c8e48d03f43fcfc445fe367758498629e1fbe"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.624457 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.638077 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.653510 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.658364 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.658384 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.658546 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:26 crc kubenswrapper[5072]: E0228 04:11:26.658920 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:26 crc kubenswrapper[5072]: E0228 04:11:26.659008 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:26 crc kubenswrapper[5072]: E0228 04:11:26.659118 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.659333 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.668283 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.679385 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.679452 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.679475 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.679499 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.679512 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.681980 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.724563 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.746783 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.770375 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.782546 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.782589 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.782603 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.782624 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.782651 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.785804 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.796372 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.808894 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.821456 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.833071 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:26Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.884986 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.885397 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.885410 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.885430 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.885442 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.987906 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.987959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.987970 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.987992 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:26 crc kubenswrapper[5072]: I0228 04:11:26.988004 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:26Z","lastTransitionTime":"2026-02-28T04:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.090899 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.090948 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.090962 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.090985 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.091001 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.194711 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.194801 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.194820 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.194844 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.194863 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.298176 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.298226 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.298241 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.298261 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.298273 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.401241 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.401279 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.401287 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.401302 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.401312 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.503999 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.504044 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.504054 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.504069 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.504082 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.607390 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.607454 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.607471 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.607497 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.607515 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.620436 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.625788 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" event={"ID":"44befe72-7499-4d23-a09b-3d715817c3cb","Type":"ContainerStarted","Data":"3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.628582 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.631977 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.632495 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.633729 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sbbcr" event={"ID":"4bcdefe7-e804-46ac-ad0a-5e593614fd6b","Type":"ContainerStarted","Data":"1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.640046 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.672908 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.689434 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.701265 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.710425 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.710457 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.710467 5072 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.710486 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.710500 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.718427 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.734088 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.759940 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.776080 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.795104 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.808850 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.813514 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.813561 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.813576 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.813600 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.813616 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.823897 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.834834 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.851429 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.866306 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.881719 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.909368 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.916710 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.916781 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.916801 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.916831 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.916853 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:27Z","lastTransitionTime":"2026-02-28T04:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.921330 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.936689 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.950044 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.963888 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:27 crc kubenswrapper[5072]: I0228 04:11:27.984773 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:27Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.004039 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.020034 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc 
kubenswrapper[5072]: I0228 04:11:28.020095 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.020106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.020123 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.020135 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.036096 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.056334 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.068092 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.080225 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.093206 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.106480 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.122578 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.122609 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.122619 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.122632 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.122664 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.225137 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.225168 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.225176 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.225188 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.225196 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.327491 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.327543 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.327553 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.327573 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.327585 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.430582 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.430628 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.430652 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.430667 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.430676 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.533164 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.533212 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.533221 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.533238 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.533249 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.636140 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.636364 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.636373 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.636388 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.636397 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.646674 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.646716 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.646725 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.658433 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.658608 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:28 crc kubenswrapper[5072]: E0228 04:11:28.658785 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.659305 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:28 crc kubenswrapper[5072]: E0228 04:11:28.659423 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:28 crc kubenswrapper[5072]: E0228 04:11:28.659513 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.675957 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.691755 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-
28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.704567 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.715251 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.726406 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.737046 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.738464 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.738508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.738531 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.738564 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.738587 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.748687 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.757305 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.779674 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.794501 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.809147 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.824670 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.835178 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.846240 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.846271 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.846280 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.846298 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.846311 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.849004 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.861288 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.873342 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.874702 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.886294 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.897022 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.909591 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.917711 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.938115 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04
:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.948730 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.948769 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.948777 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.948792 5072 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.948813 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:28Z","lastTransitionTime":"2026-02-28T04:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.950983 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.962249 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.971470 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.980214 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:28 crc kubenswrapper[5072]: I0228 04:11:28.990674 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.005166 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.014849 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.031912 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.044356 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.051071 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.051112 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.051121 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.051136 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.051145 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.062753 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.073933 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.085768 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.096461 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.107996 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.120830 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.131239 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.145607 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.153143 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.153176 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.153186 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.153201 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.153214 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.160304 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.170581 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.179004 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.189006 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.204119 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:29Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.255655 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.255780 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.255797 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.255820 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 
04:11:29.255831 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.358458 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.358780 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.358792 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.358809 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.358820 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.460523 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.460562 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.460571 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.460585 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.460594 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.562428 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.562470 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.562480 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.562495 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.562506 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.664941 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.664982 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.664993 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.665168 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.665184 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.767190 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.767234 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.767243 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.767265 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.767277 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.869264 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.869293 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.869300 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.869313 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.869322 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.971943 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.971994 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.972031 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.972055 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:29 crc kubenswrapper[5072]: I0228 04:11:29.972069 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:29Z","lastTransitionTime":"2026-02-28T04:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.075018 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.075060 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.075072 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.075087 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.075098 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.177949 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.178029 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.178056 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.178086 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.178109 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.227525 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.227553 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.227561 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.227573 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.227582 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.248139 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:30Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.251562 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.251738 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.251829 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.251912 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.252021 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.268961 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:30Z is after 2025-08-24T17:21:41Z"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.272531 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.272733 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.272860 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.272959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.273051 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.287793 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:30Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.293441 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.293480 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.293489 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.293508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.293526 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.313102 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:30Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.315985 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.316106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.316189 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.316273 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.316359 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.329225 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:30Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.329344 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.330620 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.330675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.330686 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.330702 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.330714 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.432621 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.432675 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.432684 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.432698 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.432708 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.534592 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.534626 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.534635 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.534661 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.534671 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.636386 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.636438 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.636447 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.636465 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.636476 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.657845 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.657925 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.657969 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.658027 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.658053 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:30 crc kubenswrapper[5072]: E0228 04:11:30.658166 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.738847 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.739073 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.739138 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.739211 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.739278 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.841208 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.841261 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.841275 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.841290 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.841300 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.943559 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.943855 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.943944 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.944012 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:30 crc kubenswrapper[5072]: I0228 04:11:30.944069 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:30Z","lastTransitionTime":"2026-02-28T04:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.046081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.046120 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.046128 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.046142 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.046152 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.148428 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.148487 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.148505 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.148529 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.148546 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.252034 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.252081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.252097 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.252115 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.252133 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.354606 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.354654 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.354665 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.354679 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.354688 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.456800 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.456830 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.456839 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.456851 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.456860 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.559283 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.559381 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.559400 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.559423 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.559439 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.661242 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.661405 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.661849 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.661874 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.661887 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.662822 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/0.log" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.665943 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553" exitCode=1 Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.665978 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.666841 5072 scope.go:117] "RemoveContainer" containerID="933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.681896 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.709659 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 
04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.729021 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.747407 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.764072 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.764098 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.764106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.764119 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.764129 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.770634 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.781029 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.795106 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.806869 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.819718 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.832550 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.847099 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.865133 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.867413 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.867452 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.867465 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.867483 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 
04:11:31.867496 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.878280 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.890942 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:31Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.970054 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.970081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.970089 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 
04:11:31.970101 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:31 crc kubenswrapper[5072]: I0228 04:11:31.970109 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:31Z","lastTransitionTime":"2026-02-28T04:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.072548 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.072577 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.072586 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.072601 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.072618 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.094354 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc"] Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.094954 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.098617 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.098769 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.106680 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv2ch\" (UniqueName: \"kubernetes.io/projected/a0f43af6-3fa2-43c2-bcda-a20612fa4909-kube-api-access-cv2ch\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.106755 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.106788 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.106814 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.116997 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.143816 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.169732 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 
04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.175203 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.175239 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.175250 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.175268 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.175280 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.181673 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.192995 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.204147 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.207312 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv2ch\" (UniqueName: \"kubernetes.io/projected/a0f43af6-3fa2-43c2-bcda-a20612fa4909-kube-api-access-cv2ch\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.207368 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.207392 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.207418 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.207984 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.208258 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.212968 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f43af6-3fa2-43c2-bcda-a20612fa4909-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.218925 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.233994 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.234240 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv2ch\" (UniqueName: \"kubernetes.io/projected/a0f43af6-3fa2-43c2-bcda-a20612fa4909-kube-api-access-cv2ch\") pod \"ovnkube-control-plane-749d76644c-7g5vc\" (UID: \"a0f43af6-3fa2-43c2-bcda-a20612fa4909\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.244565 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.262376 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04
:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278084 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278133 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278146 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278162 5072 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278177 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.278155 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.296300 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.316486 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.327208 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.341041 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.380024 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.380142 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.380153 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.380166 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.380175 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.408403 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.482712 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.482745 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.482754 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.482767 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.482776 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.584959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.584987 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.584995 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.585010 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.585018 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.658107 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:32 crc kubenswrapper[5072]: E0228 04:11:32.658225 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.658588 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:32 crc kubenswrapper[5072]: E0228 04:11:32.658692 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.658867 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:32 crc kubenswrapper[5072]: E0228 04:11:32.658965 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.670024 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/0.log" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.672047 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.672307 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.673026 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" event={"ID":"a0f43af6-3fa2-43c2-bcda-a20612fa4909","Type":"ContainerStarted","Data":"07070b857b54194b346d210599b65e3a9b60e82fb23592a5f2cba9406803f707"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.681723 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.687084 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.687118 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.687130 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc 
kubenswrapper[5072]: I0228 04:11:32.687147 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.687160 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.699147 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 
04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.709702 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.721848 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.730118 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.738854 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.774722 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.789664 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.789712 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.789723 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.789741 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.789753 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.800883 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.809205 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-95gbg"] Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.810005 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:32 crc kubenswrapper[5072]: E0228 04:11:32.810086 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.815508 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.823940 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.834520 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.850222 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.861681 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.871955 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.883712 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.891657 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.891694 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc 
kubenswrapper[5072]: I0228 04:11:32.891707 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.891727 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.891740 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.894236 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.909009 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 
04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.913104 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.913141 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjpz\" (UniqueName: \"kubernetes.io/projected/109581ed-36ab-4625-bf7e-bcdecb30e35a-kube-api-access-xvjpz\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " 
pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.919079 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.929340 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.939089 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.950955 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.959956 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.969131 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.985401 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e
4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.994234 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.994270 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.994279 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.994295 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.994306 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:32Z","lastTransitionTime":"2026-02-28T04:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:32 crc kubenswrapper[5072]: I0228 04:11:32.997768 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.008184 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.014013 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.014053 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjpz\" (UniqueName: \"kubernetes.io/projected/109581ed-36ab-4625-bf7e-bcdecb30e35a-kube-api-access-xvjpz\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:33 crc kubenswrapper[5072]: E0228 04:11:33.014189 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:33 crc kubenswrapper[5072]: E0228 04:11:33.014257 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:11:33.514238947 +0000 UTC m=+115.508969139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.017512 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.025491 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.029886 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjpz\" (UniqueName: \"kubernetes.io/projected/109581ed-36ab-4625-bf7e-bcdecb30e35a-kube-api-access-xvjpz\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.040052 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.050470 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.059549 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.096637 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.096893 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.096955 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.097023 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.097080 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.200106 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.200171 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.200181 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.200198 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.200207 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.302330 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.302382 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.302398 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.302429 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.302447 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.404615 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.404671 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.404683 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.404702 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.404713 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.507425 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.507464 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.507473 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.507487 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.507499 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.518897 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:33 crc kubenswrapper[5072]: E0228 04:11:33.519054 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:33 crc kubenswrapper[5072]: E0228 04:11:33.519115 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:11:34.51910227 +0000 UTC m=+116.513832462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.610132 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.610185 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.610198 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.610215 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.610230 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.677137 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" event={"ID":"a0f43af6-3fa2-43c2-bcda-a20612fa4909","Type":"ContainerStarted","Data":"b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.677395 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" event={"ID":"a0f43af6-3fa2-43c2-bcda-a20612fa4909","Type":"ContainerStarted","Data":"ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.680382 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/1.log" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.686695 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/0.log" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.690947 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2" exitCode=1 Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.690997 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.691036 5072 scope.go:117] "RemoveContainer" containerID="933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.695493 
5072 scope.go:117] "RemoveContainer" containerID="66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2" Feb 28 04:11:33 crc kubenswrapper[5072]: E0228 04:11:33.695998 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.698734 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.712538 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.712576 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.712588 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.712607 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.712618 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.715984 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc 
kubenswrapper[5072]: I0228 04:11:33.728685 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.753758 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 
04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.766840 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.780337 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.793450 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.805142 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.814765 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.814979 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.815233 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.815452 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.815570 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.815866 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.828392 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.838757 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.855705 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.867745 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.878107 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.888161 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.897030 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.907165 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.917741 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.917777 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.917787 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.917802 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.917812 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:33Z","lastTransitionTime":"2026-02-28T04:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.918740 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.928915 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.941575 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402
d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.951274 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.961164 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3f
cc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.978198 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:33 crc kubenswrapper[5072]: I0228 04:11:33.991182 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:33Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.002590 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.013458 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.019590 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.019622 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.019631 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.019669 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.019679 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.023532 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.037814 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18
bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.049168 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.058547 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc 
kubenswrapper[5072]: I0228 04:11:34.067667 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.084377 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://933f99232f97acea5775cc25e1d65289202161d2d9690145d21414ee2bb89553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:31Z\\\",\\\"message\\\":\\\"m k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:11:31.443824 6887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:31.443844 6887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 04:11:31.443866 6887 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 04:11:31.443875 6887 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:31.443906 6887 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444076 6887 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:11:31.444660 6887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0228 04:11:31.444701 6887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:11:31.444707 6887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 04:11:31.444746 6887 factory.go:656] Stopping watch factory\\\\nI0228 04:11:31.444766 6887 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:31.444781 6887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 04:11:31.444794 6887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 04\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.122043 5072 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.122081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.122090 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.122105 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.122115 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.224354 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.224588 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.224681 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.224818 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.224911 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.327567 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.327658 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.327672 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.327688 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.327700 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.430319 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.430371 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.430436 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.430455 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.430851 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.562269 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.562474 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.562559 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:11:36.562537876 +0000 UTC m=+118.557268128 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.564952 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.565015 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.565036 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.565213 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.565246 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.658410 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.658439 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.659720 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.659415 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.659815 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.659904 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.660062 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.664481 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.667415 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.667546 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.667632 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.667748 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.667822 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.694734 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/1.log" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.698274 5072 scope.go:117] "RemoveContainer" containerID="66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2" Feb 28 04:11:34 crc kubenswrapper[5072]: E0228 04:11:34.698601 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.711375 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.725074 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.733328 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.746715 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3f
cc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.761597 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.769784 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.771437 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.771451 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.771465 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.771476 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.774812 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.786771 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.796027 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.806442 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.823863 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.836093 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.847029 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.857359 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc 
kubenswrapper[5072]: I0228 04:11:34.867696 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.873838 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.873880 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.873892 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.873909 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.873920 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.877348 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.898415 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:34Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.976504 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.976548 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.976560 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.976574 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:34 crc kubenswrapper[5072]: I0228 04:11:34.976585 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:34Z","lastTransitionTime":"2026-02-28T04:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.078972 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.079038 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.079053 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.079077 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.079093 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.181126 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.181175 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.181185 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.181204 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.181214 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.283240 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.283276 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.283284 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.283298 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.283308 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.386015 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.386049 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.386060 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.386073 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.386083 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.487906 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.487967 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.487984 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.488007 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.488024 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.590556 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.590748 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.590775 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.590802 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.590821 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.668118 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.693301 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.693342 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.693351 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.693368 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.693378 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.795271 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.795317 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.795331 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.795349 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.795363 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.897284 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.897324 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.897332 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.897345 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.897353 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.999356 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.999401 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.999410 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.999424 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:35 crc kubenswrapper[5072]: I0228 04:11:35.999434 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:35Z","lastTransitionTime":"2026-02-28T04:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.103418 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.103456 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.103465 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.103477 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.103488 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.206236 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.206299 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.206315 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.206337 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.206352 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.308548 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.308584 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.308591 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.308606 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.308615 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.380826 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.381002 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.381084 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:12:08.381062115 +0000 UTC m=+150.375792297 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.410446 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.410592 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.410676 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.410756 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.410843 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.481887 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.481959 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.482006 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.482027 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482129 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 
04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482143 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482152 5072 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482192 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:12:08.482179136 +0000 UTC m=+150.476909328 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482483 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:12:08.482474924 +0000 UTC m=+150.477205116 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482527 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482537 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482545 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482566 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 04:12:08.482558826 +0000 UTC m=+150.477289018 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482602 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.482621 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:12:08.482615788 +0000 UTC m=+150.477345980 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.512963 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.513000 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.513011 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.513027 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.513040 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.583119 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.583309 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.583422 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:11:40.583394969 +0000 UTC m=+122.578125181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.615600 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.615726 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.615749 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.615772 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.615787 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.658319 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.658369 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.658406 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.658387 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.658465 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.658577 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.658676 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:36 crc kubenswrapper[5072]: E0228 04:11:36.658742 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.718032 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.718081 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.718090 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.718103 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.718114 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.820524 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.820571 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.820579 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.820594 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.820605 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.924152 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.924194 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.924205 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.924224 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:36 crc kubenswrapper[5072]: I0228 04:11:36.924236 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:36Z","lastTransitionTime":"2026-02-28T04:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.026623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.026671 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.026680 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.026695 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.026705 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.128921 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.128959 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.128968 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.128982 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.128992 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.230972 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.231030 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.231042 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.231061 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.231075 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.333986 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.334052 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.334062 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.334076 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.334106 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.435614 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.435660 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.435669 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.435682 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.435690 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.538062 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.538104 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.538130 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.538144 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.538154 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.640419 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.640453 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.640478 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.640491 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.640499 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.742510 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.742544 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.742552 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.742565 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.742574 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.845921 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.845952 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.845961 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.845975 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.845984 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.949253 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.949323 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.949365 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.949398 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:37 crc kubenswrapper[5072]: I0228 04:11:37.949422 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:37Z","lastTransitionTime":"2026-02-28T04:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.052520 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.052623 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.052677 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.052702 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.052719 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.155449 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.155503 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.155521 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.155546 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.155566 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.258422 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.258490 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.258508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.258535 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.258556 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.360935 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.360969 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.360977 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.360990 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.360999 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.462683 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.462719 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.462730 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.462745 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.462755 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.565886 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.566271 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.566548 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.566803 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.566997 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:38Z","lastTransitionTime":"2026-02-28T04:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.658351 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.658455 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.658593 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.658714 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.658957 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.659017 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.659119 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.659226 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.667864 5072 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.670352 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.680986 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 
04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.690285 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.698650 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.708190 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.725293 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.738180 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: E0228 04:11:38.739908 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.751617 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.761458 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.782671 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.793157 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.804490 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.816369 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.827135 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.840193 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.847959 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:38 crc kubenswrapper[5072]: I0228 04:11:38.856512 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:38Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.576251 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.576307 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.576319 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.576335 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.576347 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:40Z","lastTransitionTime":"2026-02-28T04:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.589978 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.594198 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.594238 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.594248 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.594261 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.594270 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:40Z","lastTransitionTime":"2026-02-28T04:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.606904 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.611509 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.611549 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.611559 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.611578 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.611589 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:40Z","lastTransitionTime":"2026-02-28T04:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.622566 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.624322 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.624522 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.624607 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:11:48.62458203 +0000 UTC m=+130.619312222 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.627519 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.627582 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.627594 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.627611 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.627625 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:40Z","lastTransitionTime":"2026-02-28T04:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.638076 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.641042 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.641086 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.641096 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.641113 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.641123 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:40Z","lastTransitionTime":"2026-02-28T04:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.644311 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.652162 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-6274
77cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.652278 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.656944 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.658063 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.658089 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.658151 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.658272 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.658359 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.658403 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.658778 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:40 crc kubenswrapper[5072]: E0228 04:11:40.658875 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.669551 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.670469 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.681570 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc 
kubenswrapper[5072]: I0228 04:11:40.691805 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.714853 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.728034 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.739716 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.751831 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.764005 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.774518 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.786910 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.798935 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.817313 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.829268 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.838794 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.848816 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:40 crc kubenswrapper[5072]: I0228 04:11:40.857000 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:40Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:42 crc kubenswrapper[5072]: I0228 04:11:42.658776 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:42 crc kubenswrapper[5072]: E0228 04:11:42.658890 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:42 crc kubenswrapper[5072]: I0228 04:11:42.658776 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:42 crc kubenswrapper[5072]: I0228 04:11:42.658972 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:42 crc kubenswrapper[5072]: E0228 04:11:42.659020 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:42 crc kubenswrapper[5072]: E0228 04:11:42.659104 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:42 crc kubenswrapper[5072]: I0228 04:11:42.659728 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:42 crc kubenswrapper[5072]: E0228 04:11:42.659805 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:43 crc kubenswrapper[5072]: E0228 04:11:43.740958 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:11:44 crc kubenswrapper[5072]: I0228 04:11:44.658930 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:44 crc kubenswrapper[5072]: I0228 04:11:44.658930 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:44 crc kubenswrapper[5072]: I0228 04:11:44.659040 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:44 crc kubenswrapper[5072]: I0228 04:11:44.659080 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:44 crc kubenswrapper[5072]: E0228 04:11:44.659113 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:44 crc kubenswrapper[5072]: E0228 04:11:44.659275 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:44 crc kubenswrapper[5072]: E0228 04:11:44.659329 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:44 crc kubenswrapper[5072]: E0228 04:11:44.659391 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:45 crc kubenswrapper[5072]: I0228 04:11:45.658741 5072 scope.go:117] "RemoveContainer" containerID="66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.658528 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.658581 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.658604 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.658730 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:46 crc kubenswrapper[5072]: E0228 04:11:46.658744 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:46 crc kubenswrapper[5072]: E0228 04:11:46.658894 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:46 crc kubenswrapper[5072]: E0228 04:11:46.658974 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:46 crc kubenswrapper[5072]: E0228 04:11:46.659192 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.737010 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/1.log" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.739372 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660"} Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.739865 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.751386 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.761527 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.771163 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc 
kubenswrapper[5072]: I0228 04:11:46.781253 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.798918 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.808411 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.817781 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.832250 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10
:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.844057 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.858244 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.871502 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.886320 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.899530 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.918065 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.930509 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.941362 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.951309 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:46 crc kubenswrapper[5072]: I0228 04:11:46.960268 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:46Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.743806 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/2.log" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.744294 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/1.log" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.746900 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" exitCode=1 Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.746953 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660"} Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.747003 5072 scope.go:117] "RemoveContainer" containerID="66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.747483 5072 scope.go:117] "RemoveContainer" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" Feb 28 04:11:47 crc kubenswrapper[5072]: E0228 04:11:47.747741 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.762959 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02
-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.775067 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.786872 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.796714 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.808671 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.816931 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.827759 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.844846 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.855713 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.865722 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.876286 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.884182 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.895382 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.905807 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.915843 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.925127 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 04:11:47.934409 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:47 crc kubenswrapper[5072]: I0228 
04:11:47.951098 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\
"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:47Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.658343 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.658496 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.658911 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.659000 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.659072 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.659082 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.659160 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.659231 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.679616 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.692054 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.702752 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.712083 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.712213 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.712261 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. 
No retries permitted until 2026-02-28 04:12:04.712249441 +0000 UTC m=+146.706979633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.713044 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.722194 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.733514 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.741887 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.746330 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.751422 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/2.log" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.755512 5072 scope.go:117] "RemoveContainer" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" Feb 28 04:11:48 crc kubenswrapper[5072]: E0228 04:11:48.755682 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.760094 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.769183 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc 
kubenswrapper[5072]: I0228 04:11:48.778687 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.793684 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66bab07afadf3ca41deb6a18b59e4602c3dd1762a977e1e185f2670e4b14e4a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"message\\\":\\\"dler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:32Z is after 2025-08-24T17:21:41Z]\\\\nI0228 04:11:32.536537 7091 services_controller.go:434] Service openshift-authentication/oauth-openshift retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{oauth-openshift openshift-authentication 327e9277-4a34-458b-9afd-a4d0b83d7a80 5000 0 2025-02-23 05:23:11 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:oauth-openshift] map[operator.openshift.io/spec-hash:d9e6d53076d47ab2d123d8b1ba8ec6543488d973dcc4e02349493cd1c33bce83 service.alpha.openshift.io/serving-cert-secret-name:v4-0-config-system-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\
"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.802413 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.811817 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.821214 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.833072 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.846167 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.858920 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.869598 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.878938 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.895797 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.906836 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.917852 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.930849 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.943677 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.959488 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.969794 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:48 crc kubenswrapper[5072]: I0228 04:11:48.982127 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.000567 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:48Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.013982 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.027047 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.039709 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.049824 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.061085 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.071277 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.081325 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:49 crc kubenswrapper[5072]: I0228 04:11:49.089926 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:49Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:50 crc kubenswrapper[5072]: I0228 04:11:50.658596 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:50 crc kubenswrapper[5072]: I0228 04:11:50.658677 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:50 crc kubenswrapper[5072]: E0228 04:11:50.658754 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:50 crc kubenswrapper[5072]: E0228 04:11:50.658885 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:50 crc kubenswrapper[5072]: I0228 04:11:50.659039 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:50 crc kubenswrapper[5072]: E0228 04:11:50.659134 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:50 crc kubenswrapper[5072]: I0228 04:11:50.659838 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:50 crc kubenswrapper[5072]: E0228 04:11:50.660056 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.025523 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.025841 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.025858 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.025926 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.025945 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:51Z","lastTransitionTime":"2026-02-28T04:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.040875 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:51Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.044277 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.044309 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.044335 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.044351 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.044361 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:51Z","lastTransitionTime":"2026-02-28T04:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.055328 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:51Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.059151 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.059183 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.059193 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.059205 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.059215 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:51Z","lastTransitionTime":"2026-02-28T04:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.071456 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:51Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.074725 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.074761 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.074770 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.074785 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.074797 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:51Z","lastTransitionTime":"2026-02-28T04:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.088364 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:51Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.093462 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.093508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.093518 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.093534 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:11:51 crc kubenswrapper[5072]: I0228 04:11:51.093546 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:11:51Z","lastTransitionTime":"2026-02-28T04:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.108105 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:51Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:51 crc kubenswrapper[5072]: E0228 04:11:51.108264 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:11:52 crc kubenswrapper[5072]: I0228 04:11:52.658131 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:52 crc kubenswrapper[5072]: I0228 04:11:52.658261 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:52 crc kubenswrapper[5072]: E0228 04:11:52.658412 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:52 crc kubenswrapper[5072]: I0228 04:11:52.658454 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:52 crc kubenswrapper[5072]: E0228 04:11:52.658552 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:52 crc kubenswrapper[5072]: E0228 04:11:52.658594 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:52 crc kubenswrapper[5072]: I0228 04:11:52.658759 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:52 crc kubenswrapper[5072]: E0228 04:11:52.658877 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:53 crc kubenswrapper[5072]: E0228 04:11:53.742883 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:11:54 crc kubenswrapper[5072]: I0228 04:11:54.658280 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:54 crc kubenswrapper[5072]: I0228 04:11:54.658363 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:54 crc kubenswrapper[5072]: E0228 04:11:54.658455 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:54 crc kubenswrapper[5072]: I0228 04:11:54.658494 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:54 crc kubenswrapper[5072]: I0228 04:11:54.658280 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:54 crc kubenswrapper[5072]: E0228 04:11:54.658586 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:54 crc kubenswrapper[5072]: E0228 04:11:54.658790 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:54 crc kubenswrapper[5072]: E0228 04:11:54.658885 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:56 crc kubenswrapper[5072]: I0228 04:11:56.658339 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:56 crc kubenswrapper[5072]: I0228 04:11:56.658357 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:56 crc kubenswrapper[5072]: E0228 04:11:56.658474 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:56 crc kubenswrapper[5072]: I0228 04:11:56.658556 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:56 crc kubenswrapper[5072]: I0228 04:11:56.658612 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:56 crc kubenswrapper[5072]: E0228 04:11:56.658736 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:56 crc kubenswrapper[5072]: E0228 04:11:56.658913 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:56 crc kubenswrapper[5072]: E0228 04:11:56.659211 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.658358 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.658420 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.658397 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.658385 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:11:58 crc kubenswrapper[5072]: E0228 04:11:58.658580 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:11:58 crc kubenswrapper[5072]: E0228 04:11:58.658613 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:11:58 crc kubenswrapper[5072]: E0228 04:11:58.658720 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:11:58 crc kubenswrapper[5072]: E0228 04:11:58.658874 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.685835 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.702428 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.715085 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.732828 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: E0228 04:11:58.743495 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.750927 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382
b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.769305 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.780805 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.791329 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc 
kubenswrapper[5072]: I0228 04:11:58.803764 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.815795 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.850762 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.882533 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.892663 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.905075 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.914802 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.924337 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.936217 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10
:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:11:58 crc kubenswrapper[5072]: I0228 04:11:58.947716 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:11:58Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:00 crc kubenswrapper[5072]: I0228 04:12:00.658310 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:00 crc kubenswrapper[5072]: I0228 04:12:00.658355 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:00 crc kubenswrapper[5072]: I0228 04:12:00.658314 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:00 crc kubenswrapper[5072]: E0228 04:12:00.658435 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:00 crc kubenswrapper[5072]: I0228 04:12:00.658310 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:00 crc kubenswrapper[5072]: E0228 04:12:00.658534 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:00 crc kubenswrapper[5072]: E0228 04:12:00.658600 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:00 crc kubenswrapper[5072]: E0228 04:12:00.658680 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.275203 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.275243 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.275255 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.275315 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.275328 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:01Z","lastTransitionTime":"2026-02-28T04:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.292028 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:01Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.297095 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.297132 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.297145 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.297164 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.297177 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:01Z","lastTransitionTime":"2026-02-28T04:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.311136 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:01Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.315460 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.315508 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.315520 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.315538 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.315548 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:01Z","lastTransitionTime":"2026-02-28T04:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.328672 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:01Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.332252 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.332411 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.332491 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.332579 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.332687 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:01Z","lastTransitionTime":"2026-02-28T04:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.345275 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:01Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.349408 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.349464 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.349477 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.349493 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:01 crc kubenswrapper[5072]: I0228 04:12:01.349503 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:01Z","lastTransitionTime":"2026-02-28T04:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.360972 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:01Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:01 crc kubenswrapper[5072]: E0228 04:12:01.361088 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:12:02 crc kubenswrapper[5072]: I0228 04:12:02.658984 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:02 crc kubenswrapper[5072]: I0228 04:12:02.659015 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:02 crc kubenswrapper[5072]: E0228 04:12:02.659129 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:02 crc kubenswrapper[5072]: I0228 04:12:02.659002 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:02 crc kubenswrapper[5072]: I0228 04:12:02.658985 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:02 crc kubenswrapper[5072]: E0228 04:12:02.659412 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:02 crc kubenswrapper[5072]: E0228 04:12:02.659219 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:02 crc kubenswrapper[5072]: E0228 04:12:02.659467 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:03 crc kubenswrapper[5072]: I0228 04:12:03.659429 5072 scope.go:117] "RemoveContainer" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" Feb 28 04:12:03 crc kubenswrapper[5072]: E0228 04:12:03.659752 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:03 crc kubenswrapper[5072]: E0228 04:12:03.744887 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:04 crc kubenswrapper[5072]: I0228 04:12:04.658934 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:04 crc kubenswrapper[5072]: I0228 04:12:04.659025 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:04 crc kubenswrapper[5072]: I0228 04:12:04.659039 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:04 crc kubenswrapper[5072]: I0228 04:12:04.659040 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.659191 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.659359 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.659509 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.659664 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:04 crc kubenswrapper[5072]: I0228 04:12:04.763366 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.763594 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:12:04 crc kubenswrapper[5072]: E0228 04:12:04.763719 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:12:36.763694727 +0000 UTC m=+178.758424939 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:12:06 crc kubenswrapper[5072]: I0228 04:12:06.658361 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:06 crc kubenswrapper[5072]: I0228 04:12:06.658401 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:06 crc kubenswrapper[5072]: I0228 04:12:06.658445 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:06 crc kubenswrapper[5072]: I0228 04:12:06.658406 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:06 crc kubenswrapper[5072]: E0228 04:12:06.658507 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:06 crc kubenswrapper[5072]: E0228 04:12:06.658670 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:06 crc kubenswrapper[5072]: E0228 04:12:06.658725 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:06 crc kubenswrapper[5072]: E0228 04:12:06.658919 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.814833 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/0.log" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.815191 5072 generic.go:334] "Generic (PLEG): container finished" podID="ae699423-376d-4342-bf44-7d70f68fadd1" containerID="b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7" exitCode=1 Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.815226 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerDied","Data":"b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7"} Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.815636 5072 scope.go:117] "RemoveContainer" containerID="b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.827018 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.845415 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.857109 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3f
cc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.869836 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.884089 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.897345 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.907776 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.920899 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.929249 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.948943 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04
:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.962554 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.975299 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.985381 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:07 crc kubenswrapper[5072]: I0228 04:12:07.993367 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:07Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.003512 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.012722 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.022912 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.031406 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.393437 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.393579 5072 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.393635 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:13:12.3936188 +0000 UTC m=+214.388348992 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.493784 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.493884 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.493912 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.493933 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494028 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:12.493998182 +0000 UTC m=+214.488728384 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494072 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494039 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494086 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494118 5072 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494123 5072 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494137 5072 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494167 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 04:13:12.494154079 +0000 UTC m=+214.488884271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494156 5072 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494187 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-28 04:13:12.49417474 +0000 UTC m=+214.488905062 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.494332 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 04:13:12.494297245 +0000 UTC m=+214.489027497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.658464 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.658539 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.658557 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.658550 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.658665 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.658793 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.658943 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.658990 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.669299 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.670999 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90
f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.686271 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.696152 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.706964 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.719544 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10
:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.731198 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.745329 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: E0228 04:12:08.745378 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.756179 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.768080 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c83
52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.778955 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.799450 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.813164 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\",\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.820567 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/0.log" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.820669 5072 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerStarted","Data":"f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa"} Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.828279 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.837610 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.845820 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.854783 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.865378 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.874818 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.891397 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1d
d56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.902099 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.912871 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.924427 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.933460 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.944297 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.952705 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.962400 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.972725 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc kubenswrapper[5072]: I0228 04:12:08.981277 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:08 crc 
kubenswrapper[5072]: I0228 04:12:08.990683 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:08Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.006080 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.015462 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.025418 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.034964 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.043337 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.054717 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.062497 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:09 crc kubenswrapper[5072]: I0228 04:12:09.073304 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:09Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:10 crc kubenswrapper[5072]: I0228 04:12:10.658809 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:10 crc kubenswrapper[5072]: I0228 04:12:10.658850 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:10 crc kubenswrapper[5072]: I0228 04:12:10.658924 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:10 crc kubenswrapper[5072]: I0228 04:12:10.659054 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:10 crc kubenswrapper[5072]: E0228 04:12:10.659065 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:10 crc kubenswrapper[5072]: E0228 04:12:10.659171 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:10 crc kubenswrapper[5072]: E0228 04:12:10.659276 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:10 crc kubenswrapper[5072]: E0228 04:12:10.659355 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.653128 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.653227 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.653252 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.653288 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.653310 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:11Z","lastTransitionTime":"2026-02-28T04:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.676094 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:11Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.680812 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.680885 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.680905 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.680936 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.680955 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:11Z","lastTransitionTime":"2026-02-28T04:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.699536 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:11Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.704755 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.704824 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.704843 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.704871 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.704895 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:11Z","lastTransitionTime":"2026-02-28T04:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.730396 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:11Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.737311 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.737361 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.737377 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.737402 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.737419 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:11Z","lastTransitionTime":"2026-02-28T04:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.757428 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:11Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.762868 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.762954 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.762981 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.763016 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:11 crc kubenswrapper[5072]: I0228 04:12:11.763032 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:11Z","lastTransitionTime":"2026-02-28T04:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.780131 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:11Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:11 crc kubenswrapper[5072]: E0228 04:12:11.780297 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:12:12 crc kubenswrapper[5072]: I0228 04:12:12.659015 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:12 crc kubenswrapper[5072]: I0228 04:12:12.659078 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:12 crc kubenswrapper[5072]: I0228 04:12:12.659065 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:12 crc kubenswrapper[5072]: I0228 04:12:12.659026 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:12 crc kubenswrapper[5072]: E0228 04:12:12.659234 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:12 crc kubenswrapper[5072]: E0228 04:12:12.659376 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:12 crc kubenswrapper[5072]: E0228 04:12:12.659483 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:12 crc kubenswrapper[5072]: E0228 04:12:12.659716 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:13 crc kubenswrapper[5072]: E0228 04:12:13.747267 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:14 crc kubenswrapper[5072]: I0228 04:12:14.658926 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:14 crc kubenswrapper[5072]: I0228 04:12:14.658976 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:14 crc kubenswrapper[5072]: I0228 04:12:14.658986 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:14 crc kubenswrapper[5072]: I0228 04:12:14.659061 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:14 crc kubenswrapper[5072]: E0228 04:12:14.659070 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:14 crc kubenswrapper[5072]: E0228 04:12:14.659162 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:14 crc kubenswrapper[5072]: E0228 04:12:14.659247 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:14 crc kubenswrapper[5072]: E0228 04:12:14.659338 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:16 crc kubenswrapper[5072]: I0228 04:12:16.657934 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:16 crc kubenswrapper[5072]: I0228 04:12:16.657983 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:16 crc kubenswrapper[5072]: I0228 04:12:16.658069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:16 crc kubenswrapper[5072]: E0228 04:12:16.658190 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:16 crc kubenswrapper[5072]: I0228 04:12:16.658213 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:16 crc kubenswrapper[5072]: E0228 04:12:16.658327 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:16 crc kubenswrapper[5072]: E0228 04:12:16.658414 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:16 crc kubenswrapper[5072]: E0228 04:12:16.658470 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.659488 5072 scope.go:117] "RemoveContainer" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.849319 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/2.log" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.851499 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057"} Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.852522 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.863909 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.880535 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.893166 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.904605 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.916024 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.925136 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.938938 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.947387 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.956630 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.974752 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.986872 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:17 crc kubenswrapper[5072]: I0228 04:12:17.999518 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:17Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.010013 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.019215 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.031264 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.041046 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.055838 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.074925 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.085379 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc 
kubenswrapper[5072]: I0228 04:12:18.658555 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.658716 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.658919 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.658986 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.659156 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.659222 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.659450 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.659517 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.670470 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.689892 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.701530 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402
d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.712922 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.723349 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3f
cc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.734906 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.746299 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.747668 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.759600 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.769474 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.777548 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.788106 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.805326 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.818310 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.830214 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.840393 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.849090 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.857059 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/3.log" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.857599 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/2.log" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.859444 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.861446 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" exitCode=1 Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.861485 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057"} Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.861520 5072 scope.go:117] "RemoveContainer" containerID="81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.862262 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:12:18 crc kubenswrapper[5072]: E0228 04:12:18.862429 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.873627 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.884054 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.896902 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.908945 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.923219 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.934353 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc 
kubenswrapper[5072]: I0228 04:12:18.944462 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.963306 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81a35f15d0cc51a86876065ad4db3f6885f6082c0fcbb233219a133853081660\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:11:46Z\\\",\\\"message\\\":\\\"/externalversions/factory.go:140\\\\nI0228 04:11:46.731745 7332 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732141 7332 reflector.go:311] Stopping reflector *v1.EgressIP (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.732157 7332 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:11:46.733751 7332 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 04:11:46.733851 7332 factory.go:656] Stopping watch factory\\\\nI0228 04:11:46.734687 7332 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 04:11:46.742622 7332 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:11:46.742708 7332 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:11:46.742785 7332 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:11:46.742827 7332 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:11:46.742933 7332 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:18Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671026 7646 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671236 7646 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:12:18.670798 7646 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:12:18.672289 7646 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:12:18.672382 7646 factory.go:656] Stopping watch factory\\\\nI0228 04:12:18.672409 7646 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 04:12:18.708593 7646 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:12:18.708677 7646 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:12:18.708791 7646 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:12:18.708829 7646 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:12:18.708930 7646 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.976203 5072 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.984565 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:18 crc kubenswrapper[5072]: I0228 04:12:18.995080 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3f
cc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:18Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.007786 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.018809 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.030921 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.041692 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.051028 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.065019 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.081428 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.092599 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.105741 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.115987 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.866485 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/3.log" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.870253 5072 scope.go:117] "RemoveContainer" 
containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:12:19 crc kubenswrapper[5072]: E0228 04:12:19.870415 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.882082 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.895306 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.908220 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.921607 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.930943 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.940155 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba02254763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.950780 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10
:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.962882 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.974507 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.985714 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:19 crc kubenswrapper[5072]: I0228 04:12:19.995886 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:19Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.006470 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.022895 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.032429 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.043889 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.053018 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.063159 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.079253 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:18Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671026 7646 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671236 7646 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:12:18.670798 7646 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:12:18.672289 7646 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:12:18.672382 7646 factory.go:656] Stopping watch factory\\\\nI0228 04:12:18.672409 7646 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 04:12:18.708593 7646 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:12:18.708677 7646 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:12:18.708791 7646 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:12:18.708829 7646 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:12:18.708930 7646 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:12:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.088958 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e
0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:20Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.658576 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.658671 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:20 crc kubenswrapper[5072]: E0228 04:12:20.658730 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.658692 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:20 crc kubenswrapper[5072]: I0228 04:12:20.658668 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:20 crc kubenswrapper[5072]: E0228 04:12:20.658798 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:20 crc kubenswrapper[5072]: E0228 04:12:20.658876 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:20 crc kubenswrapper[5072]: E0228 04:12:20.658967 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.959291 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.959381 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.959403 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.959437 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.959461 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:21Z","lastTransitionTime":"2026-02-28T04:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:21 crc kubenswrapper[5072]: E0228 04:12:21.984890 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:21Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.990918 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.990999 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.991019 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.991049 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:21 crc kubenswrapper[5072]: I0228 04:12:21.991068 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:21Z","lastTransitionTime":"2026-02-28T04:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.013103 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.018164 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.018239 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.018260 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.018290 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.018310 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:22Z","lastTransitionTime":"2026-02-28T04:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.032814 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.037884 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.037939 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.037962 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.037993 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.038017 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:22Z","lastTransitionTime":"2026-02-28T04:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.063906 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.069348 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.069383 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.069392 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.069406 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.069416 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:22Z","lastTransitionTime":"2026-02-28T04:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.082104 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:22Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.082365 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.658851 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.658914 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.658948 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:22 crc kubenswrapper[5072]: I0228 04:12:22.658997 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.658995 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.659230 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.659333 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:22 crc kubenswrapper[5072]: E0228 04:12:22.659536 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:23 crc kubenswrapper[5072]: E0228 04:12:23.749429 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:24 crc kubenswrapper[5072]: I0228 04:12:24.658540 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:24 crc kubenswrapper[5072]: I0228 04:12:24.658599 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:24 crc kubenswrapper[5072]: E0228 04:12:24.658677 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:24 crc kubenswrapper[5072]: I0228 04:12:24.658554 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:24 crc kubenswrapper[5072]: I0228 04:12:24.658848 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:24 crc kubenswrapper[5072]: E0228 04:12:24.658840 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:24 crc kubenswrapper[5072]: E0228 04:12:24.658900 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:24 crc kubenswrapper[5072]: E0228 04:12:24.658941 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:26 crc kubenswrapper[5072]: I0228 04:12:26.658699 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:26 crc kubenswrapper[5072]: I0228 04:12:26.658747 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:26 crc kubenswrapper[5072]: I0228 04:12:26.658820 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:26 crc kubenswrapper[5072]: I0228 04:12:26.658762 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:26 crc kubenswrapper[5072]: E0228 04:12:26.658880 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:26 crc kubenswrapper[5072]: E0228 04:12:26.658978 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:26 crc kubenswrapper[5072]: E0228 04:12:26.659065 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:26 crc kubenswrapper[5072]: E0228 04:12:26.659169 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.658047 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.658057 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.658070 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.658210 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:28 crc kubenswrapper[5072]: E0228 04:12:28.658374 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:28 crc kubenswrapper[5072]: E0228 04:12:28.658593 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:28 crc kubenswrapper[5072]: E0228 04:12:28.658751 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:28 crc kubenswrapper[5072]: E0228 04:12:28.659052 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.670906 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a035bbab-1d8f-4120-aaf7-88984d936939\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5124a9f2335194528c90f49821e4d1fa57bb5d45d81be281ba7de68b6db8e503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxlst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lrpf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.686719 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"043491df-2577-47f6-9a5b-03fecada16ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:18Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671026 7646 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 04:12:18.671236 7646 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 04:12:18.670798 7646 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 04:12:18.672289 7646 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 04:12:18.672382 7646 factory.go:656] Stopping watch factory\\\\nI0228 04:12:18.672409 7646 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 04:12:18.708593 7646 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0228 04:12:18.708677 7646 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0228 04:12:18.708791 7646 ovnkube.go:599] Stopped ovnkube\\\\nI0228 04:12:18.708829 7646 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 04:12:18.708930 7646 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:12:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://604b9064779dfbcded
213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvpck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kfpqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.699708 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"efa3983e-a837-4c8a-9561-2f93a293272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34724f49f9d38bf481365aff1b0139a09f926c871e1a6840cfb6fb5d99358448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1950822dc64ce3c84b441fc6dc100ebf4d9669049c08f8c0ef21e6eb30f136bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:08Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 04:09:42.159193 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 04:09:42.161351 1 observer_polling.go:159] Starting file observer\\\\nI0228 04:09:42.222416 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 04:09:42.228824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 04:10:07.970624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 04:10:07.970713 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:10:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c20aeec9468a4876f5db381201080c44af00d4d5f79f73541faec3a8c7e5dd7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3fdbc0636689b1c505ce29b494474b17f8d7b1a1c40ae4a7df22a4e0e43bfca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.710553 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.723104 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd7503647e9f633a6cd5130e7da0d99b35e85f8534f66191d821f51193130b06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.734321 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05c307856d10684f54b580dadee66dc636fd5f62fdb464df6f4fa74aab88d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.749304 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44befe72-7499-4d23-a09b-3d715817c3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8c2e2523bd296f73003e740d2868163a7df4734c10be2c86fc1e2aa62c8352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b336e165d20d3c7e22d31ace167d92b6e75d8eca9545ddfa6e104de475fd9d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd36c4ed89bf18c43ad6a4180d2710d9f018fbe6d0e30b4ffea5f89ca4848075\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5caa6ffe540a6993f3187ff62530e7d094699e27228c092040929f3ad0060284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:22Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f402d82ea567fa0c318332c331c8a61c8ad5957094dd05de02a564528d5cb19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df36963f4ab44675c6d8e22a607b3aff51dde8f759ccd808e67747f9b8e1cf0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26629d9e0f54d496b7242e9b09edec08a341f0302496dfc74a3675ce5ea75fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:25Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwhsk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n6jpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: E0228 04:12:28.750066 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.759503 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sbbcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bcdefe7-e804-46ac-ad0a-5e593614fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad3ce336eebfb16a74b9fe4cfaba8f000d957b0174c41537a464fb9c35374b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2vlnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sbbcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.769627 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0f43af6-3fa2-43c2-bcda-a20612fa4909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad6bb01bb4e8384cf21f3c9685f05cef09f2ff98ba0225
4763c92b9caba75b3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b68fc83cc9ead75b24b42db57b3c2303adc3fcc02da67740a33a8c6b9d300871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cv2ch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7g5vc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.785550 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"152bf264-e4f6-47a1-818d-291cf135c91d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8147830d1b3f4a182599f91c8c22e3a576b0eb490abb717fdb3f5d8935bed72d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81de0063ef550336022938bbee3deee5c85bd71d0fe56ebfb0cfce30359bebc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1ebfc293bb64884104f802401ae8cbd269c8237c7b152b1fb1832f631c8f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633c33aa1dff5f7df880dc47e4a688c1dd56705f9b3eda53cb1e830e91cfa791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7114e25f3e3e183fca6beb9468507c6d3276500896964a4446b4a00015c747d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b327f8d3461368c6cf356ddc74172913e5bfd67a78dc8e77fe51aa29626dac6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afcc8189d7ea8ac2ac081534998424d6f6dc9807b0b8410287f0567a1111590e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382b5d5e370483cabc53ed278b7a6b68da76ae3eb17cc34be622c520d70c5a1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.797047 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef530fad-4e99-4682-a3b4-604c37c2b1a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T04:10:39Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0228 04:10:39.272143 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 04:10:39.272262 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 04:10:39.272935 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-229024387/tls.crt::/tmp/serving-cert-229024387/tls.key\\\\\\\"\\\\nI0228 04:10:39.545196 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 04:10:39.547100 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 04:10:39.547155 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 04:10:39.547212 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 04:10:39.547251 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 04:10:39.550977 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 04:10:39.551003 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 04:10:39.551012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 04:10:39.551016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 04:10:39.551019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 04:10:39.551022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 04:10:39.551193 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 04:10:39.553021 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:10:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aced7107309d40710244feb173ebe4ae2
418b3278816ed8d14a2e4a1c7f895e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.807767 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7970eef448348e8be9e480b9d6eaa6d657244f6aa775246cdc5ddf94c9b5d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f66d07e11a0e3cb995b56328c3e7d4a03e7af5ecce148d8cc4375024162fea00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.817731 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.825885 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxs2k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d35c1a3-39ec-4e00-9d3f-5d1934701d44\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a7a36d316719af6c0aa329e9d66adb367a371ccc653270bfe9f0d9befdc1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2gvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxs2k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.836110 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pz98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae699423-376d-4342-bf44-7d70f68fadd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T04:12:06Z\\\",\\\"message\\\":\\\"2026-02-28T04:11:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf\\\\n2026-02-28T04:11:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a464ee9-2d2e-4b69-a098-79af90c450cf to /host/opt/cni/bin/\\\\n2026-02-28T04:11:21Z [verbose] multus-daemon started\\\\n2026-02-28T04:11:21Z [verbose] Readiness Indicator file check\\\\n2026-02-28T04:12:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T04:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2tcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pz98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.844739 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db297d46-ddc0-49eb-893f-036519c8c9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7779f536e33b45272502b7cdffb80567981b7ea7bb007f18e01eef6689b64a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://740a5600b9eed7a432dcc447c5a49b2a454afce62216d109e2835696c1adee56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.854267 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10babdb-f877-4b8b-be76-024bf945583c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8fcf3c84cc8d43452422396d9f53c179f0264e124ebbfff139954a568b40532\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55beeaa697034d33a24c08794ac6fa9c624f2af676d99123431005b3cad00a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://577dc47d1b2ce12222648c5ba704654974f41d842e674ffd3650d1e5b44dafc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T04:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c4c90a4dd5023e2ba1265a8863fa97a7d430fa2c47360f2a70c9331154d93c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T04:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T04:09:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:09:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.864793 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:28 crc kubenswrapper[5072]: I0228 04:12:28.874214 5072 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-95gbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"109581ed-36ab-4625-bf7e-bcdecb30e35a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T04:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvjpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T04:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-95gbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:28Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:30 crc 
kubenswrapper[5072]: I0228 04:12:30.658777 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:30 crc kubenswrapper[5072]: I0228 04:12:30.658797 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:30 crc kubenswrapper[5072]: E0228 04:12:30.658891 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:30 crc kubenswrapper[5072]: I0228 04:12:30.658971 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:30 crc kubenswrapper[5072]: E0228 04:12:30.659132 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:30 crc kubenswrapper[5072]: I0228 04:12:30.659161 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:30 crc kubenswrapper[5072]: E0228 04:12:30.659213 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:30 crc kubenswrapper[5072]: E0228 04:12:30.659258 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:31 crc kubenswrapper[5072]: I0228 04:12:31.659236 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:12:31 crc kubenswrapper[5072]: E0228 04:12:31.659401 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.115942 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.115985 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.115996 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.116012 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.116025 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:32Z","lastTransitionTime":"2026-02-28T04:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.128988 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.132801 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.132834 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.132846 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.132860 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.132873 5072 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:32Z","lastTransitionTime":"2026-02-28T04:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.146511 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.150546 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.150580 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.150589 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.150605 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.150615 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:32Z","lastTransitionTime":"2026-02-28T04:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.163406 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.167109 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.167192 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.167216 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.167247 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.167270 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:32Z","lastTransitionTime":"2026-02-28T04:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.180581 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.184404 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.184443 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.184454 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.184472 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.184484 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:32Z","lastTransitionTime":"2026-02-28T04:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.195980 5072 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T04:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c99101c4-599a-4ac8-9800-c4679859c59e\\\",\\\"systemUUID\\\":\\\"05edca7b-62f1-4864-9cd6-627477cf26a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T04:12:32Z is after 2025-08-24T17:21:41Z" Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.196128 5072 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.658122 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.658286 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.658408 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.658601 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.658691 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:32 crc kubenswrapper[5072]: I0228 04:12:32.658754 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.658767 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:32 crc kubenswrapper[5072]: E0228 04:12:32.658890 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:33 crc kubenswrapper[5072]: E0228 04:12:33.751324 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:34 crc kubenswrapper[5072]: I0228 04:12:34.658656 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:34 crc kubenswrapper[5072]: I0228 04:12:34.659836 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:34 crc kubenswrapper[5072]: I0228 04:12:34.659061 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:34 crc kubenswrapper[5072]: E0228 04:12:34.662834 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:34 crc kubenswrapper[5072]: I0228 04:12:34.663144 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:34 crc kubenswrapper[5072]: E0228 04:12:34.663150 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:34 crc kubenswrapper[5072]: E0228 04:12:34.663243 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:34 crc kubenswrapper[5072]: E0228 04:12:34.663313 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:36 crc kubenswrapper[5072]: I0228 04:12:36.658564 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:36 crc kubenswrapper[5072]: I0228 04:12:36.658617 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:36 crc kubenswrapper[5072]: I0228 04:12:36.658691 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.658736 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:36 crc kubenswrapper[5072]: I0228 04:12:36.658795 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.658898 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.658984 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.659080 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:36 crc kubenswrapper[5072]: I0228 04:12:36.767430 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.767678 5072 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:12:36 crc kubenswrapper[5072]: E0228 04:12:36.767789 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs podName:109581ed-36ab-4625-bf7e-bcdecb30e35a nodeName:}" failed. No retries permitted until 2026-02-28 04:13:40.767760499 +0000 UTC m=+242.762490731 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs") pod "network-metrics-daemon-95gbg" (UID: "109581ed-36ab-4625-bf7e-bcdecb30e35a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.658768 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.658873 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.658978 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:38 crc kubenswrapper[5072]: E0228 04:12:38.659029 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:38 crc kubenswrapper[5072]: E0228 04:12:38.658881 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:38 crc kubenswrapper[5072]: E0228 04:12:38.659127 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.659183 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:38 crc kubenswrapper[5072]: E0228 04:12:38.659231 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.682689 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.682672226 podStartE2EDuration="30.682672226s" podCreationTimestamp="2026-02-28 04:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.681572682 +0000 UTC m=+180.676302884" watchObservedRunningTime="2026-02-28 04:12:38.682672226 +0000 UTC m=+180.677402418" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.696277 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.696259732 podStartE2EDuration="1m3.696259732s" podCreationTimestamp="2026-02-28 04:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.696082326 +0000 UTC m=+180.690812518" watchObservedRunningTime="2026-02-28 04:12:38.696259732 +0000 UTC m=+180.690989924" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.726951 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podStartSLOduration=135.726928741 
podStartE2EDuration="2m15.726928741s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.726444916 +0000 UTC m=+180.721175118" watchObservedRunningTime="2026-02-28 04:12:38.726928741 +0000 UTC m=+180.721658933" Feb 28 04:12:38 crc kubenswrapper[5072]: E0228 04:12:38.751802 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.775539 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sbbcr" podStartSLOduration=135.775517179 podStartE2EDuration="2m15.775517179s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.762740457 +0000 UTC m=+180.757470669" watchObservedRunningTime="2026-02-28 04:12:38.775517179 +0000 UTC m=+180.770247371" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.775718 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7g5vc" podStartSLOduration=135.775711325 podStartE2EDuration="2m15.775711325s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.774664953 +0000 UTC m=+180.769395145" watchObservedRunningTime="2026-02-28 04:12:38.775711325 +0000 UTC m=+180.770441517" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.787279 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=58.787265448 podStartE2EDuration="58.787265448s" podCreationTimestamp="2026-02-28 04:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.787130314 +0000 UTC m=+180.781860506" watchObservedRunningTime="2026-02-28 04:12:38.787265448 +0000 UTC m=+180.781995640" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.890428 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n6jpz" podStartSLOduration=135.890408747 podStartE2EDuration="2m15.890408747s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.877974016 +0000 UTC m=+180.872704218" watchObservedRunningTime="2026-02-28 04:12:38.890408747 +0000 UTC m=+180.885138939" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.914024 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8pz98" podStartSLOduration=135.914004989 podStartE2EDuration="2m15.914004989s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.891230442 +0000 UTC m=+180.885960654" watchObservedRunningTime="2026-02-28 04:12:38.914004989 +0000 UTC m=+180.908735181" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.929744 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.929728251 podStartE2EDuration="1m11.929728251s" podCreationTimestamp="2026-02-28 04:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-28 04:12:38.915512786 +0000 UTC m=+180.910242988" watchObservedRunningTime="2026-02-28 04:12:38.929728251 +0000 UTC m=+180.924458443" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.951175 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.951157117 podStartE2EDuration="1m26.951157117s" podCreationTimestamp="2026-02-28 04:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.929905546 +0000 UTC m=+180.924635738" watchObservedRunningTime="2026-02-28 04:12:38.951157117 +0000 UTC m=+180.945887309" Feb 28 04:12:38 crc kubenswrapper[5072]: I0228 04:12:38.972402 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vxs2k" podStartSLOduration=135.972383276 podStartE2EDuration="2m15.972383276s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:38.971832869 +0000 UTC m=+180.966563061" watchObservedRunningTime="2026-02-28 04:12:38.972383276 +0000 UTC m=+180.967113458" Feb 28 04:12:40 crc kubenswrapper[5072]: I0228 04:12:40.658256 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:40 crc kubenswrapper[5072]: E0228 04:12:40.658416 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:40 crc kubenswrapper[5072]: I0228 04:12:40.658256 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:40 crc kubenswrapper[5072]: I0228 04:12:40.658291 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:40 crc kubenswrapper[5072]: E0228 04:12:40.658531 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:40 crc kubenswrapper[5072]: E0228 04:12:40.658695 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:40 crc kubenswrapper[5072]: I0228 04:12:40.659076 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:40 crc kubenswrapper[5072]: E0228 04:12:40.659249 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.360301 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.360364 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.360382 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.360418 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.360436 5072 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T04:12:42Z","lastTransitionTime":"2026-02-28T04:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.407951 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4"] Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.408281 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.410659 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.412196 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.412934 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.413094 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.525054 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f843a395-6f96-42d2-ad5c-ba62c073da99-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.525101 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: 
\"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.525122 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f843a395-6f96-42d2-ad5c-ba62c073da99-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.525235 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.525282 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f843a395-6f96-42d2-ad5c-ba62c073da99-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626766 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626809 5072 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f843a395-6f96-42d2-ad5c-ba62c073da99-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626836 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f843a395-6f96-42d2-ad5c-ba62c073da99-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626857 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626871 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f843a395-6f96-42d2-ad5c-ba62c073da99-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.626932 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: 
\"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.627011 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f843a395-6f96-42d2-ad5c-ba62c073da99-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.627566 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f843a395-6f96-42d2-ad5c-ba62c073da99-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.633252 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f843a395-6f96-42d2-ad5c-ba62c073da99-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.641495 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f843a395-6f96-42d2-ad5c-ba62c073da99-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6bbc4\" (UID: \"f843a395-6f96-42d2-ad5c-ba62c073da99\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.658274 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.658332 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.658368 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.658394 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:42 crc kubenswrapper[5072]: E0228 04:12:42.658392 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:42 crc kubenswrapper[5072]: E0228 04:12:42.658470 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:42 crc kubenswrapper[5072]: E0228 04:12:42.658523 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:42 crc kubenswrapper[5072]: E0228 04:12:42.658579 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.719458 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.730007 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.738851 5072 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.936200 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" event={"ID":"f843a395-6f96-42d2-ad5c-ba62c073da99","Type":"ContainerStarted","Data":"c8ec70bb52a466dcfc04aae3f7f51d4b1a0200c091d5e6d29e6605f226cbb7a5"} Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.936253 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" event={"ID":"f843a395-6f96-42d2-ad5c-ba62c073da99","Type":"ContainerStarted","Data":"b4ac6a54198fa1eb469e6156c417d1c173632147df04c4ab9750d8ef83a0cfa7"} Feb 28 04:12:42 crc kubenswrapper[5072]: I0228 04:12:42.950780 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6bbc4" podStartSLOduration=139.950760467 podStartE2EDuration="2m19.950760467s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:12:42.950626003 +0000 UTC m=+184.945356195" watchObservedRunningTime="2026-02-28 04:12:42.950760467 +0000 UTC m=+184.945490659" Feb 28 04:12:43 crc kubenswrapper[5072]: I0228 04:12:43.659174 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:12:43 crc kubenswrapper[5072]: E0228 04:12:43.659346 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:43 crc kubenswrapper[5072]: E0228 04:12:43.752610 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:44 crc kubenswrapper[5072]: I0228 04:12:44.658021 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:44 crc kubenswrapper[5072]: I0228 04:12:44.658035 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:44 crc kubenswrapper[5072]: I0228 04:12:44.658075 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:44 crc kubenswrapper[5072]: I0228 04:12:44.658105 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:44 crc kubenswrapper[5072]: E0228 04:12:44.658248 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:44 crc kubenswrapper[5072]: E0228 04:12:44.658562 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:44 crc kubenswrapper[5072]: E0228 04:12:44.658668 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:44 crc kubenswrapper[5072]: E0228 04:12:44.658710 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:46 crc kubenswrapper[5072]: I0228 04:12:46.657969 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:46 crc kubenswrapper[5072]: I0228 04:12:46.658027 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:46 crc kubenswrapper[5072]: I0228 04:12:46.658042 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:46 crc kubenswrapper[5072]: E0228 04:12:46.658123 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:46 crc kubenswrapper[5072]: I0228 04:12:46.658160 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:46 crc kubenswrapper[5072]: E0228 04:12:46.658289 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:46 crc kubenswrapper[5072]: E0228 04:12:46.658339 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:46 crc kubenswrapper[5072]: E0228 04:12:46.658406 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:48 crc kubenswrapper[5072]: I0228 04:12:48.658881 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:48 crc kubenswrapper[5072]: I0228 04:12:48.658942 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:48 crc kubenswrapper[5072]: I0228 04:12:48.658942 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:48 crc kubenswrapper[5072]: E0228 04:12:48.659905 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:48 crc kubenswrapper[5072]: I0228 04:12:48.659915 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:48 crc kubenswrapper[5072]: E0228 04:12:48.660018 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:48 crc kubenswrapper[5072]: E0228 04:12:48.660204 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:48 crc kubenswrapper[5072]: E0228 04:12:48.660240 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:48 crc kubenswrapper[5072]: E0228 04:12:48.753182 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:50 crc kubenswrapper[5072]: I0228 04:12:50.658334 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:50 crc kubenswrapper[5072]: I0228 04:12:50.658415 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:50 crc kubenswrapper[5072]: I0228 04:12:50.658450 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:50 crc kubenswrapper[5072]: E0228 04:12:50.658574 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:50 crc kubenswrapper[5072]: I0228 04:12:50.658589 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:50 crc kubenswrapper[5072]: E0228 04:12:50.658676 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:50 crc kubenswrapper[5072]: E0228 04:12:50.658722 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:50 crc kubenswrapper[5072]: E0228 04:12:50.658789 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:52 crc kubenswrapper[5072]: I0228 04:12:52.658055 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:52 crc kubenswrapper[5072]: I0228 04:12:52.658165 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:52 crc kubenswrapper[5072]: E0228 04:12:52.658219 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:52 crc kubenswrapper[5072]: I0228 04:12:52.658262 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:52 crc kubenswrapper[5072]: I0228 04:12:52.658277 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:52 crc kubenswrapper[5072]: E0228 04:12:52.658416 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:52 crc kubenswrapper[5072]: E0228 04:12:52.658539 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:52 crc kubenswrapper[5072]: E0228 04:12:52.658672 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:53 crc kubenswrapper[5072]: E0228 04:12:53.755192 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.971498 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/1.log" Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.972044 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/0.log" Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.972100 5072 generic.go:334] "Generic (PLEG): container finished" podID="ae699423-376d-4342-bf44-7d70f68fadd1" containerID="f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa" exitCode=1 Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.972133 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerDied","Data":"f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa"} Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.972165 5072 
scope.go:117] "RemoveContainer" containerID="b43b4a121ee19e7464d3f3f18bb81005ed7a19288dfe7701487f608655bf78d7" Feb 28 04:12:53 crc kubenswrapper[5072]: I0228 04:12:53.973936 5072 scope.go:117] "RemoveContainer" containerID="f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa" Feb 28 04:12:53 crc kubenswrapper[5072]: E0228 04:12:53.974264 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8pz98_openshift-multus(ae699423-376d-4342-bf44-7d70f68fadd1)\"" pod="openshift-multus/multus-8pz98" podUID="ae699423-376d-4342-bf44-7d70f68fadd1" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.658923 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:54 crc kubenswrapper[5072]: E0228 04:12:54.659081 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.659194 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.659473 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:54 crc kubenswrapper[5072]: E0228 04:12:54.659561 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.659562 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:54 crc kubenswrapper[5072]: E0228 04:12:54.659773 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:54 crc kubenswrapper[5072]: E0228 04:12:54.659872 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.659893 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:12:54 crc kubenswrapper[5072]: E0228 04:12:54.660086 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kfpqp_openshift-ovn-kubernetes(043491df-2577-47f6-9a5b-03fecada16ce)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" Feb 28 04:12:54 crc kubenswrapper[5072]: I0228 04:12:54.976158 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/1.log" Feb 28 04:12:56 crc kubenswrapper[5072]: I0228 04:12:56.658678 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:56 crc kubenswrapper[5072]: I0228 04:12:56.658738 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:56 crc kubenswrapper[5072]: I0228 04:12:56.658678 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:56 crc kubenswrapper[5072]: E0228 04:12:56.658848 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:56 crc kubenswrapper[5072]: I0228 04:12:56.659062 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:56 crc kubenswrapper[5072]: E0228 04:12:56.659049 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:56 crc kubenswrapper[5072]: E0228 04:12:56.659125 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:56 crc kubenswrapper[5072]: E0228 04:12:56.659205 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:58 crc kubenswrapper[5072]: I0228 04:12:58.658908 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:12:58 crc kubenswrapper[5072]: I0228 04:12:58.658914 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:12:58 crc kubenswrapper[5072]: I0228 04:12:58.659030 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:12:58 crc kubenswrapper[5072]: E0228 04:12:58.659890 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:12:58 crc kubenswrapper[5072]: I0228 04:12:58.659910 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:12:58 crc kubenswrapper[5072]: E0228 04:12:58.660052 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:12:58 crc kubenswrapper[5072]: E0228 04:12:58.660102 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:12:58 crc kubenswrapper[5072]: E0228 04:12:58.660135 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:12:58 crc kubenswrapper[5072]: E0228 04:12:58.756047 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:13:00 crc kubenswrapper[5072]: I0228 04:13:00.658052 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 04:13:00 crc kubenswrapper[5072]: I0228 04:13:00.658136 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:13:00 crc kubenswrapper[5072]: E0228 04:13:00.658185 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 04:13:00 crc kubenswrapper[5072]: I0228 04:13:00.658067 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:13:00 crc kubenswrapper[5072]: I0228 04:13:00.658075 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:13:00 crc kubenswrapper[5072]: E0228 04:13:00.658316 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:13:00 crc kubenswrapper[5072]: E0228 04:13:00.658320 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a" Feb 28 04:13:00 crc kubenswrapper[5072]: E0228 04:13:00.658370 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:13:02 crc kubenswrapper[5072]: I0228 04:13:02.658937 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:13:02 crc kubenswrapper[5072]: E0228 04:13:02.659148 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 04:13:02 crc kubenswrapper[5072]: I0228 04:13:02.659223 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:13:02 crc kubenswrapper[5072]: I0228 04:13:02.659430 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 04:13:02 crc kubenswrapper[5072]: E0228 04:13:02.659506 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 04:13:02 crc kubenswrapper[5072]: E0228 04:13:02.659419 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a"
Feb 28 04:13:02 crc kubenswrapper[5072]: I0228 04:13:02.659874 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:02 crc kubenswrapper[5072]: E0228 04:13:02.659999 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:13:03 crc kubenswrapper[5072]: E0228 04:13:03.757744 5072 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 28 04:13:04 crc kubenswrapper[5072]: I0228 04:13:04.658229 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg"
Feb 28 04:13:04 crc kubenswrapper[5072]: I0228 04:13:04.658312 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:04 crc kubenswrapper[5072]: I0228 04:13:04.658337 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:04 crc kubenswrapper[5072]: E0228 04:13:04.658385 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a"
Feb 28 04:13:04 crc kubenswrapper[5072]: E0228 04:13:04.658490 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:13:04 crc kubenswrapper[5072]: I0228 04:13:04.658636 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:04 crc kubenswrapper[5072]: E0228 04:13:04.658689 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:13:04 crc kubenswrapper[5072]: E0228 04:13:04.658756 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:13:05 crc kubenswrapper[5072]: I0228 04:13:05.659050 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.011330 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/3.log"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.013548 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerStarted","Data":"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37"}
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.014017 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.041394 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podStartSLOduration=163.041363609 podStartE2EDuration="2m43.041363609s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:06.040553464 +0000 UTC m=+208.035283666" watchObservedRunningTime="2026-02-28 04:13:06.041363609 +0000 UTC m=+208.036093811"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.550954 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-95gbg"]
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.551093 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg"
Feb 28 04:13:06 crc kubenswrapper[5072]: E0228 04:13:06.551235 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.658510 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.658554 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.658623 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:06 crc kubenswrapper[5072]: E0228 04:13:06.658688 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:13:06 crc kubenswrapper[5072]: E0228 04:13:06.658773 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:13:06 crc kubenswrapper[5072]: E0228 04:13:06.658912 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:13:06 crc kubenswrapper[5072]: I0228 04:13:06.659094 5072 scope.go:117] "RemoveContainer" containerID="f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa"
Feb 28 04:13:07 crc kubenswrapper[5072]: I0228 04:13:07.018455 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/1.log"
Feb 28 04:13:07 crc kubenswrapper[5072]: I0228 04:13:07.019415 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerStarted","Data":"7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38"}
Feb 28 04:13:08 crc kubenswrapper[5072]: I0228 04:13:08.658325 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:08 crc kubenswrapper[5072]: I0228 04:13:08.658361 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:08 crc kubenswrapper[5072]: I0228 04:13:08.658407 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:08 crc kubenswrapper[5072]: I0228 04:13:08.658443 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg"
Feb 28 04:13:08 crc kubenswrapper[5072]: E0228 04:13:08.659866 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 04:13:08 crc kubenswrapper[5072]: E0228 04:13:08.660019 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 04:13:08 crc kubenswrapper[5072]: E0228 04:13:08.660086 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 04:13:08 crc kubenswrapper[5072]: E0228 04:13:08.660234 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-95gbg" podUID="109581ed-36ab-4625-bf7e-bcdecb30e35a"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.658390 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.658447 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.658420 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.658393 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.660558 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.660786 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.661098 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.661297 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.661586 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 28 04:13:10 crc kubenswrapper[5072]: I0228 04:13:10.663008 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.456510 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.457863 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.558329 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.558575 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:12 crc kubenswrapper[5072]: E0228 04:13:12.558753 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:15:14.558699382 +0000 UTC m=+336.553429614 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.558915 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.559072 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.565541 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.567468 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.570572 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.777856 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.786005 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 04:13:12 crc kubenswrapper[5072]: I0228 04:13:12.791118 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.037079 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"eefc5eeb641a9cb81f3c8902597ca03125676101c182adfca7b36fb55c98b32d"}
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.134687 5072 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.186561 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.187586 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.188279 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.190778 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bsswj"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.190923 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.192714 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.192882 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.193947 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.194224 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.195239 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.197172 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4r96"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.197752 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.199420 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-49m8v"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.201033 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-49m8v"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.201718 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mzmcb"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.202423 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mzmcb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.202877 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4d9s6"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.203443 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.204285 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.204780 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.210530 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.211319 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.211630 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.212477 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.216671 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.218490 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.218998 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.219232 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.219366 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.219487 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.219607 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.219789 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.240731 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sb9bc"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.264057 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.264503 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.265054 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsh4l"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.265281 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.266067 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.266762 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.266095 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268254 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-serving-cert\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268292 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-config\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268313 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-image-import-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268337 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268362 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clzpw\" (UniqueName: \"kubernetes.io/projected/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-kube-api-access-clzpw\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268400 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit-dir\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268424 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268447 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268473 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268494 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-node-pullsecrets\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268516 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbdz\" (UniqueName: \"kubernetes.io/projected/dbdad8a2-b26c-4587-8a9c-cdf96b65c15f-kube-api-access-2rbdz\") pod \"downloads-7954f5f757-mzmcb\" (UID: \"dbdad8a2-b26c-4587-8a9c-cdf96b65c15f\") " pod="openshift-console/downloads-7954f5f757-mzmcb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268536 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwwt\" (UniqueName: \"kubernetes.io/projected/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-kube-api-access-hkwwt\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268556 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7fgw\" (UniqueName: \"kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268571 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-policies\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268587 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268606 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268624 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-serving-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268661 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268686 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdhtf\" (UniqueName: \"kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268715 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268742 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vf5\" (UniqueName: \"kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268769 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268806 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268848 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwftl\" (UniqueName: \"kubernetes.io/projected/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-kube-api-access-pwftl\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268888 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268914 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5047b83c-e577-419c-b2e0-05c6f7baaef3-machine-approver-tls\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268939 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268955 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268976 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkfq\" (UniqueName: \"kubernetes.io/projected/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-kube-api-access-krkfq\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.268992 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269009 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269028 5072 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-encryption-config\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269044 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-encryption-config\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269060 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269077 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9035860a-a3e9-439c-bfed-89b060a0bdc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269091 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-config\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269108 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8qn\" (UniqueName: \"kubernetes.io/projected/c36ce709-c726-4390-abb9-2ebcaecbf1c0-kube-api-access-4g8qn\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269124 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-images\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269143 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269166 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37da02ec-0d39-471d-93ea-5cca2236656d-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269193 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269213 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwjd\" (UniqueName: \"kubernetes.io/projected/37da02ec-0d39-471d-93ea-5cca2236656d-kube-api-access-5jwjd\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269231 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-service-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269252 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269270 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsq4\" (UniqueName: \"kubernetes.io/projected/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-kube-api-access-dfsq4\") 
pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269287 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c36ce709-c726-4390-abb9-2ebcaecbf1c0-serving-cert\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269307 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269324 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269342 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269358 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269374 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-config\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269391 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-client\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269405 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269420 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269437 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269453 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ld8g\" (UniqueName: \"kubernetes.io/projected/9035860a-a3e9-439c-bfed-89b060a0bdc5-kube-api-access-2ld8g\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269471 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269487 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: 
\"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269505 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269521 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcdw\" (UniqueName: \"kubernetes.io/projected/b1e546a4-808d-4c86-a8ab-274186b278a6-kube-api-access-sjcdw\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269537 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269553 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-trusted-ca\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269571 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-auth-proxy-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269587 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vsq\" (UniqueName: \"kubernetes.io/projected/5047b83c-e577-419c-b2e0-05c6f7baaef3-kube-api-access-w2vsq\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269602 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-config\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269619 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269634 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-serving-cert\") pod \"authentication-operator-69f744f599-bsswj\" (UID: 
\"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269668 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37da02ec-0d39-471d-93ea-5cca2236656d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269684 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-dir\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269703 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-client\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.269727 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-serving-cert\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.272352 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-58vm7"] Feb 28 04:13:13 crc 
kubenswrapper[5072]: I0228 04:13:13.273535 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.273706 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x4ltm"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.278206 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.278597 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.279168 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.307753 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.308070 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.308330 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.308755 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.308889 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 28 04:13:13 crc 
kubenswrapper[5072]: I0228 04:13:13.308909 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309066 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309169 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309226 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309474 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309675 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309798 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309886 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310056 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310169 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310195 5072 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310346 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310467 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.309910 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.310966 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311012 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311131 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311168 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311247 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311260 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311432 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311456 5072 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311254 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311549 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311241 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311680 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.311761 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.312169 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.312485 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.312731 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.312952 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313144 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313310 5072 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313331 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313460 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313485 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313038 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313608 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313691 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313771 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313793 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313846 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.313777 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: 
I0228 04:13:13.314003 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.314688 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.314726 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.314692 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315008 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315142 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315157 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315195 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315554 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315663 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315679 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 28 04:13:13 
crc kubenswrapper[5072]: I0228 04:13:13.315705 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315710 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315573 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315632 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315793 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315840 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315843 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315907 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315934 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315952 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.315976 5072 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316148 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316255 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316279 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316418 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316801 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316830 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.316845 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.317188 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.318058 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.318249 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.318620 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.318706 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.319140 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.319365 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.319532 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.319699 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.319831 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.320182 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.321236 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.325157 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.325424 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.325546 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.329487 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.329986 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.330087 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.330561 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.339367 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.339848 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.340147 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 
28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.340637 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.341543 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.342336 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.342913 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.343399 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.343868 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.344163 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.366991 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.369559 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391706 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-service-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391746 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmvc\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-kube-api-access-hjmvc\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391769 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391787 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsq4\" (UniqueName: 
\"kubernetes.io/projected/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-kube-api-access-dfsq4\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391810 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-trusted-ca-bundle\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391830 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391848 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-proxy-tls\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391865 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c36ce709-c726-4390-abb9-2ebcaecbf1c0-serving-cert\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 
04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391883 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391901 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391919 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391935 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-config\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391949 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391964 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ld8g\" (UniqueName: \"kubernetes.io/projected/9035860a-a3e9-439c-bfed-89b060a0bdc5-kube-api-access-2ld8g\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391983 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-client\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.391997 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392016 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392030 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392045 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392061 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392076 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-trusted-ca\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392092 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-auth-proxy-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc 
kubenswrapper[5072]: I0228 04:13:13.392114 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcdw\" (UniqueName: \"kubernetes.io/projected/b1e546a4-808d-4c86-a8ab-274186b278a6-kube-api-access-sjcdw\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392136 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392159 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vsq\" (UniqueName: \"kubernetes.io/projected/5047b83c-e577-419c-b2e0-05c6f7baaef3-kube-api-access-w2vsq\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392181 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-config\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392203 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-metrics-tls\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392225 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-dir\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392245 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392266 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-serving-cert\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392286 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37da02ec-0d39-471d-93ea-5cca2236656d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392305 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-client\") 
pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392328 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-serving-cert\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392347 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-service-ca\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392362 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-trusted-ca\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392397 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-image-import-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392415 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392431 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-serving-cert\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392444 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-config\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392461 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rczk\" (UniqueName: \"kubernetes.io/projected/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-kube-api-access-6rczk\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392480 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clzpw\" (UniqueName: \"kubernetes.io/projected/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-kube-api-access-clzpw\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392514 5072 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit-dir\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392541 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392563 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392610 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-oauth-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392656 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbdz\" (UniqueName: \"kubernetes.io/projected/dbdad8a2-b26c-4587-8a9c-cdf96b65c15f-kube-api-access-2rbdz\") pod \"downloads-7954f5f757-mzmcb\" (UID: \"dbdad8a2-b26c-4587-8a9c-cdf96b65c15f\") " 
pod="openshift-console/downloads-7954f5f757-mzmcb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392695 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392717 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-node-pullsecrets\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392749 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkwwt\" (UniqueName: \"kubernetes.io/projected/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-kube-api-access-hkwwt\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392770 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7fgw\" (UniqueName: \"kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392790 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-policies\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392810 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392834 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hvm\" (UniqueName: \"kubernetes.io/projected/39c4f31f-248a-42ad-9aa0-166f036be3ac-kube-api-access-c2hvm\") pod \"migrator-59844c95c7-flgmc\" (UID: \"39c4f31f-248a-42ad-9aa0-166f036be3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392860 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392882 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-serving-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392903 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392936 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdhtf\" (UniqueName: \"kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392959 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392982 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.392999 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: 
\"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393029 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vf5\" (UniqueName: \"kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393051 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393067 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393086 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwftl\" (UniqueName: \"kubernetes.io/projected/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-kube-api-access-pwftl\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393107 5072 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393132 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5047b83c-e577-419c-b2e0-05c6f7baaef3-machine-approver-tls\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393152 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393167 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393183 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krkfq\" (UniqueName: \"kubernetes.io/projected/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-kube-api-access-krkfq\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" 
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393197 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393214 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393229 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-images\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393248 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-encryption-config\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393262 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-encryption-config\") pod 
\"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393277 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-oauth-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393293 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-config\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393308 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393323 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9035860a-a3e9-439c-bfed-89b060a0bdc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393338 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bxt5\" 
(UniqueName: \"kubernetes.io/projected/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-kube-api-access-7bxt5\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393354 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-images\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393372 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393390 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8qn\" (UniqueName: \"kubernetes.io/projected/c36ce709-c726-4390-abb9-2ebcaecbf1c0-kube-api-access-4g8qn\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393405 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:13 crc kubenswrapper[5072]: 
I0228 04:13:13.393421 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37da02ec-0d39-471d-93ea-5cca2236656d-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393437 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393455 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.393470 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwjd\" (UniqueName: \"kubernetes.io/projected/37da02ec-0d39-471d-93ea-5cca2236656d-kube-api-access-5jwjd\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.394319 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-service-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: 
\"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.394498 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-node-pullsecrets\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.395451 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-policies\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.398368 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.399216 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-config\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.399290 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1e546a4-808d-4c86-a8ab-274186b278a6-audit-dir\") pod 
\"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.403837 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.404602 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-images\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.407715 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.408236 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.408772 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-serving-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.409043 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.409220 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.410033 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.410359 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37da02ec-0d39-471d-93ea-5cca2236656d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.411306 5072 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ingress/router-default-5444994796-9wdtp"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.411423 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.412790 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.412817 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-config\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.412823 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"] Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.412906 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.413188 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-serving-cert\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.413322 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.413551 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.415149 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.415884 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5047b83c-e577-419c-b2e0-05c6f7baaef3-auth-proxy-config\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.417886 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9035860a-a3e9-439c-bfed-89b060a0bdc5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.418251 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37da02ec-0d39-471d-93ea-5cca2236656d-serving-cert\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 
28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.418530 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.423535 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-config\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.423882 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit-dir\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.423960 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.423692 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.425314 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.425567 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.425594 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.425799 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.425980 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-audit\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.426286 5072 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.426439 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.426911 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e546a4-808d-4c86-a8ab-274186b278a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.427334 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-config\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.428121 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-encryption-config\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.428962 5072 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.430474 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.432826 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-etcd-client\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.433092 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.433695 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c36ce709-c726-4390-abb9-2ebcaecbf1c0-serving-cert\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.434445 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.436074 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.436601 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.442269 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.448257 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.450460 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.453286 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1e546a4-808d-4c86-a8ab-274186b278a6-serving-cert\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.453714 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.455808 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5047b83c-e577-419c-b2e0-05c6f7baaef3-machine-approver-tls\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.455956 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.456170 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-encryption-config\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.456255 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4r96"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.456876 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.457491 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.458202 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c36ce709-c726-4390-abb9-2ebcaecbf1c0-trusted-ca\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.458242 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.458431 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-64249"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.459087 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.461143 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.462487 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-serving-cert\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.462582 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-etcd-client\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.462817 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.465291 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.465357 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.465930 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.466004 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.466553 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.468305 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.468957 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-image-import-ca\") pod \"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.472359 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.473916 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.485459 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.485808 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.489389 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.490434 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.490703 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.493868 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.494411 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495206 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495273 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495324 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-images\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495359 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-oauth-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495383 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bxt5\" (UniqueName: \"kubernetes.io/projected/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-kube-api-access-7bxt5\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495416 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495441 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495473 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmvc\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-kube-api-access-hjmvc\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495502 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-trusted-ca-bundle\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495537 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-proxy-tls\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495603 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-metrics-tls\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495659 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-service-ca\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495656 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495682 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-trusted-ca\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495709 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rczk\" (UniqueName: \"kubernetes.io/projected/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-kube-api-access-6rczk\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495744 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-oauth-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.495798 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2hvm\" (UniqueName: \"kubernetes.io/projected/39c4f31f-248a-42ad-9aa0-166f036be3ac-kube-api-access-c2hvm\") pod \"migrator-59844c95c7-flgmc\" (UID: \"39c4f31f-248a-42ad-9aa0-166f036be3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.497626 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.498019 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-trusted-ca-bundle\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.498158 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-oauth-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.498191 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-service-ca\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.498228 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-49m8v"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.499137 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.499161 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.499625 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6htld"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.500357 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.500883 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.501716 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mzmcb"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.501549 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-serving-cert\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.502390 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.503268 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-console-oauth-config\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.504241 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jp259"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.504926 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.505055 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jp259"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.505859 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.506500 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.507715 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537532-qwbxr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.510341 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537532-qwbxr"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.510670 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.514598 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4d9s6"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.516463 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wnkn9"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.517685 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wnkn9"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.517916 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sb9bc"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.519975 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.521289 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mvt85"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.522715 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.526134 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.527588 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.529017 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsh4l"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.530235 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.530489 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.534524 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.535872 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.536914 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.538101 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.540898 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x4ltm"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.542308 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bsswj"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.543704 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537532-qwbxr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.545072 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.546097 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.547189 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.548423 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.549787 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-58vm7"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.551191 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.551820 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.552535 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wnkn9"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.553841 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-64249"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.555196 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6htld"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.556920 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.558199 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.559466 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.560694 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.561745 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.562802 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.564053 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4qfjg"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.565120 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4qfjg"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.565198 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-md7mn"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.565951 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-md7mn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.566555 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.567978 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mvt85"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.569513 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.570696 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jp259"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.571484 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.571847 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-md7mn"]
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.591596 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.614242 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.632992 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.670974 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.679808 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-images\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.691346 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.711395 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.724747 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-proxy-tls\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.732334 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.751523 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.771525 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.791262 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.811392 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.831949 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.852326 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.871825 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.891524 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.911590 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.931118 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.950803 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.971779 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 28 04:13:13 crc kubenswrapper[5072]: I0228 04:13:13.992361 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.011880 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.032031 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.042273 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8408a5bb7659a4df0e3e0d3e0fc77c8095c28553b7553bbf8d132af5e4f4f9b8"}
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.043970 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ed680e155cb56e66f5c155b7135063c4cc8ed39f07cec0e081a669381902ef00"}
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.044023 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c5eae2333df1863de1fe7d6c8b429b09b14d14435dee6ffcaf5e6ff33ee9b947"}
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.044143 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.045461 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5c153dd28d68f87687de635a2db6b4390580e21aa16d1f451744c3be932a0ae3"}
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.045502 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3d45b539a24a5dfdf09180a06a1f8afd7f33999252a4aacdd95335763080b441"}
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.052113 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.070722 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.091953 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 28 04:13:14 crc kubenswrapper[5072]: I0228
04:13:14.112041 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.132054 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.151527 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.171321 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.201704 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.209654 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-trusted-ca\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.211921 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.222799 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-metrics-tls\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.231674 5072 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.252375 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.307145 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwjd\" (UniqueName: \"kubernetes.io/projected/37da02ec-0d39-471d-93ea-5cca2236656d-kube-api-access-5jwjd\") pod \"openshift-config-operator-7777fb866f-l4r96\" (UID: \"37da02ec-0d39-471d-93ea-5cca2236656d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.331876 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbdz\" (UniqueName: \"kubernetes.io/projected/dbdad8a2-b26c-4587-8a9c-cdf96b65c15f-kube-api-access-2rbdz\") pod \"downloads-7954f5f757-mzmcb\" (UID: \"dbdad8a2-b26c-4587-8a9c-cdf96b65c15f\") " pod="openshift-console/downloads-7954f5f757-mzmcb" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.350796 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mzmcb" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.364417 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkwwt\" (UniqueName: \"kubernetes.io/projected/f0aaa88f-ecf9-47b0-9349-737e855a9ed4-kube-api-access-hkwwt\") pod \"authentication-operator-69f744f599-bsswj\" (UID: \"f0aaa88f-ecf9-47b0-9349-737e855a9ed4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.367225 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7fgw\" (UniqueName: \"kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw\") pod \"oauth-openshift-558db77b4-4d9s6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.372612 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.394140 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vsq\" (UniqueName: \"kubernetes.io/projected/5047b83c-e577-419c-b2e0-05c6f7baaef3-kube-api-access-w2vsq\") pod \"machine-approver-56656f9798-tdnkt\" (UID: \"5047b83c-e577-419c-b2e0-05c6f7baaef3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.410674 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdhtf\" (UniqueName: \"kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf\") pod \"route-controller-manager-6576b87f9c-j55ft\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.429630 5072 request.go:700] Waited for 1.010812327s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.432843 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ld8g\" (UniqueName: \"kubernetes.io/projected/9035860a-a3e9-439c-bfed-89b060a0bdc5-kube-api-access-2ld8g\") pod \"cluster-samples-operator-665b6dd947-fbm42\" (UID: \"9035860a-a3e9-439c-bfed-89b060a0bdc5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.455347 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clzpw\" (UniqueName: \"kubernetes.io/projected/c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8-kube-api-access-clzpw\") pod 
\"apiserver-76f77b778f-sb9bc\" (UID: \"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8\") " pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.465438 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.470156 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcdw\" (UniqueName: \"kubernetes.io/projected/b1e546a4-808d-4c86-a8ab-274186b278a6-kube-api-access-sjcdw\") pod \"apiserver-7bbb656c7d-qvn7m\" (UID: \"b1e546a4-808d-4c86-a8ab-274186b278a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.482245 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.486248 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g8qn\" (UniqueName: \"kubernetes.io/projected/c36ce709-c726-4390-abb9-2ebcaecbf1c0-kube-api-access-4g8qn\") pod \"console-operator-58897d9998-49m8v\" (UID: \"c36ce709-c726-4390-abb9-2ebcaecbf1c0\") " pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.500796 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.523431 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vf5\" (UniqueName: \"kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5\") pod \"controller-manager-879f6c89f-dcccb\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.527059 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.540023 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.541556 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwftl\" (UniqueName: \"kubernetes.io/projected/280cee64-7ec9-4dd4-9fb8-931b2d7a5818-kube-api-access-pwftl\") pod \"openshift-apiserver-operator-796bbdcf4f-kmrvp\" (UID: \"280cee64-7ec9-4dd4-9fb8-931b2d7a5818\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.542199 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.546022 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.555946 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.561288 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.564452 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mzmcb"] Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.572251 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.579709 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.602275 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.603186 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4d9s6"] Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.615035 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.635303 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.652846 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.683256 
5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.691354 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.702348 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.703795 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.722014 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.751581 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.757664 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krkfq\" (UniqueName: \"kubernetes.io/projected/707bbe1d-eb9e-4d9d-8e70-e88429b8c077-kube-api-access-krkfq\") pod \"machine-api-operator-5694c8668f-9vfgz\" (UID: \"707bbe1d-eb9e-4d9d-8e70-e88429b8c077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.774475 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.811869 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsq4\" (UniqueName: \"kubernetes.io/projected/a2b10a22-bf51-42f1-82e7-23b09dc84d3a-kube-api-access-dfsq4\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-t44gr\" (UID: \"a2b10a22-bf51-42f1-82e7-23b09dc84d3a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.812880 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.813858 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.834984 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.855214 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.876528 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.877750 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bsswj"] Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.877953 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.893127 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.912510 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.936142 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.941123 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"] Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.954967 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.956569 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m"] Feb 28 04:13:14 crc kubenswrapper[5072]: I0228 04:13:14.972168 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.008307 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-sb9bc"] Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.011747 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hvm\" (UniqueName: \"kubernetes.io/projected/39c4f31f-248a-42ad-9aa0-166f036be3ac-kube-api-access-c2hvm\") pod \"migrator-59844c95c7-flgmc\" (UID: \"39c4f31f-248a-42ad-9aa0-166f036be3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.026941 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmvc\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-kube-api-access-hjmvc\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.047092 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bxt5\" (UniqueName: \"kubernetes.io/projected/0bc940ed-4de2-4e88-98d7-8d9de59cd63d-kube-api-access-7bxt5\") pod \"console-f9d7485db-58vm7\" (UID: \"0bc940ed-4de2-4e88-98d7-8d9de59cd63d\") " pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.065970 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99ab1bd5-e727-47e3-8f77-78f5a5795c7c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gf64p\" (UID: \"99ab1bd5-e727-47e3-8f77-78f5a5795c7c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.068881 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" 
event={"ID":"b1e546a4-808d-4c86-a8ab-274186b278a6","Type":"ContainerStarted","Data":"ef4a6b5283e0c78dc2b144646f020f2fa16635c4b49a4b69bd32dfb70fd68e36"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.071407 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" event={"ID":"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30","Type":"ContainerStarted","Data":"aa791c5076cac2053e8fd2feebde7a75347f36327bbf9e018ac6e11108bd3049"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.085886 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rczk\" (UniqueName: \"kubernetes.io/projected/0fb104aa-1ec4-4941-acfb-8d8458df7d4a-kube-api-access-6rczk\") pod \"machine-config-operator-74547568cd-jrbwn\" (UID: \"0fb104aa-1ec4-4941-acfb-8d8458df7d4a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.092037 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" event={"ID":"f0aaa88f-ecf9-47b0-9349-737e855a9ed4","Type":"ContainerStarted","Data":"c53100af6a4581b812270cb4eb06689596bfbb6b5ce30963d68970fb278da9ac"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.095767 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.100986 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" event={"ID":"497b9208-4958-46e8-8aeb-8bc2e0f172d6","Type":"ContainerStarted","Data":"a65b19f5331f6726b3e322abd80c9079762ea319beea932246d37b929ef53c82"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.102204 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" event={"ID":"5047b83c-e577-419c-b2e0-05c6f7baaef3","Type":"ContainerStarted","Data":"61212213d7d0e143e936b2ba96a6a41f72bd60f32395c6e08318e4776adfd9c4"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.102228 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" event={"ID":"5047b83c-e577-419c-b2e0-05c6f7baaef3","Type":"ContainerStarted","Data":"d1b96ef9ad0ddb7a8fcd8244e9703eb109277a69e9ad53f87ac57403c5a4f285"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.105082 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mzmcb" event={"ID":"dbdad8a2-b26c-4587-8a9c-cdf96b65c15f","Type":"ContainerStarted","Data":"0f617812d3b7bfddf81ca14ab3361c59a5280e014914ea426b93432bbefe06e9"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.105115 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mzmcb" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.105124 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mzmcb" event={"ID":"dbdad8a2-b26c-4587-8a9c-cdf96b65c15f","Type":"ContainerStarted","Data":"b55091ac5c6b83136caea9f543c583f8400f354e12c35847ca3c7caf1d877fb3"} Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.106063 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.106108 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mzmcb" podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.111119 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.138924 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp"] Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.142080 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.152266 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.171398 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.191688 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.208334 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9vfgz"] Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.209093 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.210813 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.232078 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.234450 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.247234 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-49m8v"] Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.251368 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l4r96"] Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.255745 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.262772 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.272265 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.274498 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.292019 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: W0228 04:13:15.293458 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b10a22_bf51_42f1_82e7_23b09dc84d3a.slice/crio-42b640c90025946c4268fb417bec199f3bb9c8233c6eb1a86fe467a21d3de598 WatchSource:0}: Error finding container 42b640c90025946c4268fb417bec199f3bb9c8233c6eb1a86fe467a21d3de598: Status 404 returned error can't find the container with id 42b640c90025946c4268fb417bec199f3bb9c8233c6eb1a86fe467a21d3de598
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.312623 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.332195 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.354271 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.356793 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.370748 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.374790 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.390566 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.410954 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.422239 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.431402 5072 request.go:700] Waited for 1.911980689s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.449597 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.452433 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.474916 5072 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.491235 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.513739 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.519081 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.531243 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.551299 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.571592 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.577218 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.591290 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: W0228 04:13:15.601199 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c4f31f_248a_42ad_9aa0_166f036be3ac.slice/crio-08c619beda1285f2967d5322b16795b71c5b983a53db422695f488598fb5ac47 WatchSource:0}: Error finding container 08c619beda1285f2967d5322b16795b71c5b983a53db422695f488598fb5ac47: Status 404 returned error can't find the container with id 08c619beda1285f2967d5322b16795b71c5b983a53db422695f488598fb5ac47
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.610531 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.636806 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.708387 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p"]
Feb 28 04:13:15 crc kubenswrapper[5072]: W0228 04:13:15.726604 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ab1bd5_e727_47e3_8f77_78f5a5795c7c.slice/crio-564f8fa56c82dfce0c1d1596d502cd9d19e295c9b18742c53d3833f2f7d40fb1 WatchSource:0}: Error finding container 564f8fa56c82dfce0c1d1596d502cd9d19e295c9b18742c53d3833f2f7d40fb1: Status 404 returned error can't find the container with id 564f8fa56c82dfce0c1d1596d502cd9d19e295c9b18742c53d3833f2f7d40fb1
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764529 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764595 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78753d91-e30f-43df-8c10-5f8a978a755f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764711 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e440ea37-6f17-4863-a958-9e4b9debe3e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764778 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5vtg\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-kube-api-access-x5vtg\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764865 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764888 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-ca\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.764960 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-serving-cert\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765028 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-client\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765051 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qn7z\" (UniqueName: \"kubernetes.io/projected/281cab3a-f295-45d8-90f6-7e010d5daea5-kube-api-access-6qn7z\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765074 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630fc50d-827c-4e60-87ad-aa5012ecbcd8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765100 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765139 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765165 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765201 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-kube-api-access-ckz6z\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765235 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765259 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e440ea37-6f17-4863-a958-9e4b9debe3e3-proxy-tls\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765369 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4003ab4-f058-48d2-835e-e50ecd38cebb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765391 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78753d91-e30f-43df-8c10-5f8a978a755f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765451 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/672db961-8de6-46ec-9dd8-5d2ef7572eef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765539 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765609 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/630fc50d-827c-4e60-87ad-aa5012ecbcd8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765634 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96f3be9c-c37e-4f19-b218-a9e6f8461d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765677 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78753d91-e30f-43df-8c10-5f8a978a755f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765745 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4003ab4-f058-48d2-835e-e50ecd38cebb-config\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765769 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630fc50d-827c-4e60-87ad-aa5012ecbcd8-config\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765880 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4k4q\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.765964 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-config\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768238 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768270 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4003ab4-f058-48d2-835e-e50ecd38cebb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768296 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/281cab3a-f295-45d8-90f6-7e010d5daea5-metrics-tls\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768529 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnlvr\" (UniqueName: \"kubernetes.io/projected/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-kube-api-access-bnlvr\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768564 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmcz\" (UniqueName: \"kubernetes.io/projected/e440ea37-6f17-4863-a958-9e4b9debe3e3-kube-api-access-2lmcz\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.768673 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: E0228 04:13:15.769386 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.269367844 +0000 UTC m=+218.264098236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.769787 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktfmb\" (UniqueName: \"kubernetes.io/projected/672db961-8de6-46ec-9dd8-5d2ef7572eef-kube-api-access-ktfmb\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.771052 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.771373 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96f3be9c-c37e-4f19-b218-a9e6f8461d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.771462 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-service-ca\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.802319 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-58vm7"]
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872475 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:15 crc kubenswrapper[5072]: E0228 04:13:15.872633 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.37261023 +0000 UTC m=+218.367340422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872679 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-key\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872704 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-default-certificate\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872752 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872769 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-ca\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872786 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-serving-cert\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872831 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq2t7\" (UniqueName: \"kubernetes.io/projected/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-kube-api-access-sq2t7\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872852 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnmkz\" (UniqueName: \"kubernetes.io/projected/3644bb59-5f48-4f4d-b244-d5128487ff5f-kube-api-access-jnmkz\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872918 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-metrics-certs\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872934 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-client\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872950 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qn7z\" (UniqueName: \"kubernetes.io/projected/281cab3a-f295-45d8-90f6-7e010d5daea5-kube-api-access-6qn7z\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872968 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630fc50d-827c-4e60-87ad-aa5012ecbcd8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.872983 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873010 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873026 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873044 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-csi-data-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873059 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873075 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873091 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873116 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873136 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-kube-api-access-ckz6z\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873155 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e440ea37-6f17-4863-a958-9e4b9debe3e3-proxy-tls\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873173 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sg8f\" (UniqueName: \"kubernetes.io/projected/d8d6896e-b13d-4c3b-b0e6-7feb24127794-kube-api-access-8sg8f\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873200 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4003ab4-f058-48d2-835e-e50ecd38cebb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873215 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78753d91-e30f-43df-8c10-5f8a978a755f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873239 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-registration-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873255 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873273 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbc6\" (UniqueName: \"kubernetes.io/projected/58b4223e-e786-4022-a470-4b3c6aa754dd-kube-api-access-xcbc6\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873294 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/672db961-8de6-46ec-9dd8-5d2ef7572eef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873313 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-srv-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873339 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51561c1c-e376-4dc6-9429-b1bc39a54988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873367 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-mountpoint-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873383 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873405 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873462 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96f3be9c-c37e-4f19-b218-a9e6f8461d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873480 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78753d91-e30f-43df-8c10-5f8a978a755f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"
Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873502 5072 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/630fc50d-827c-4e60-87ad-aa5012ecbcd8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873563 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4003ab4-f058-48d2-835e-e50ecd38cebb-config\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873589 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-ca\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873601 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630fc50d-827c-4e60-87ad-aa5012ecbcd8-config\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873632 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-cabundle\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.873669 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxjs\" (UniqueName: \"kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs\") pod \"auto-csr-approver-29537532-qwbxr\" (UID: \"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac\") " pod="openshift-infra/auto-csr-approver-29537532-qwbxr" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.874367 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78753d91-e30f-43df-8c10-5f8a978a755f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.879563 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630fc50d-827c-4e60-87ad-aa5012ecbcd8-config\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.879982 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630fc50d-827c-4e60-87ad-aa5012ecbcd8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.879978 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.888997 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.890428 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96f3be9c-c37e-4f19-b218-a9e6f8461d02-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.892444 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4003ab4-f058-48d2-835e-e50ecd38cebb-config\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.894216 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 
04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.895256 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e440ea37-6f17-4863-a958-9e4b9debe3e3-proxy-tls\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.898041 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjmk\" (UniqueName: \"kubernetes.io/projected/5a1b591b-8355-4747-8858-baf81ce6928d-kube-api-access-dsjmk\") pod \"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.898131 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-srv-cert\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.898202 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghwrz\" (UniqueName: \"kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.899615 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2zjk\" (UniqueName: 
\"kubernetes.io/projected/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-kube-api-access-g2zjk\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.899686 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.906138 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-serving-cert\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: E0228 04:13:15.906615 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.406580333 +0000 UTC m=+218.401310535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.919949 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4003ab4-f058-48d2-835e-e50ecd38cebb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.920339 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/672db961-8de6-46ec-9dd8-5d2ef7572eef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.924982 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-client\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.945589 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4zd\" (UniqueName: 
\"kubernetes.io/projected/51561c1c-e376-4dc6-9429-b1bc39a54988-kube-api-access-nq4zd\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.946147 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4k4q\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.946272 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-config\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.946888 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947152 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4003ab4-f058-48d2-835e-e50ecd38cebb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947213 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/281cab3a-f295-45d8-90f6-7e010d5daea5-metrics-tls\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947246 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac478b24-16b1-460c-97d2-6be88f80ed94-config-volume\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947284 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-certs\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947335 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnlvr\" (UniqueName: \"kubernetes.io/projected/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-kube-api-access-bnlvr\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947361 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmcz\" (UniqueName: \"kubernetes.io/projected/e440ea37-6f17-4863-a958-9e4b9debe3e3-kube-api-access-2lmcz\") pod 
\"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947384 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8hh\" (UniqueName: \"kubernetes.io/projected/908db76f-57be-45ae-9ebe-40d4e707fc5c-kube-api-access-lz8hh\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947438 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947464 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a1b591b-8355-4747-8858-baf81ce6928d-cert\") pod \"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947491 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txdpk\" (UniqueName: \"kubernetes.io/projected/61d5e54f-0eeb-4b59-9511-cdf807911640-kube-api-access-txdpk\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947518 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h9t2\" (UniqueName: \"kubernetes.io/projected/ac478b24-16b1-460c-97d2-6be88f80ed94-kube-api-access-6h9t2\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947529 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947541 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-socket-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947563 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-node-bootstrap-token\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947582 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac478b24-16b1-460c-97d2-6be88f80ed94-metrics-tls\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 
04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947468 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-config\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947623 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjn5w\" (UniqueName: \"kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947682 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d8d6896e-b13d-4c3b-b0e6-7feb24127794-tmpfs\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947710 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947737 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57rp\" (UniqueName: \"kubernetes.io/projected/49976b40-ea77-4857-a99f-f4a65df82e05-kube-api-access-z57rp\") pod 
\"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947865 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49976b40-ea77-4857-a99f-f4a65df82e05-service-ca-bundle\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947915 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-serving-cert\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947945 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktfmb\" (UniqueName: \"kubernetes.io/projected/672db961-8de6-46ec-9dd8-5d2ef7572eef-kube-api-access-ktfmb\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.947973 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h6z9\" (UniqueName: \"kubernetes.io/projected/7f7f662b-a78e-4292-871f-e1c6cbfca641-kube-api-access-4h6z9\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948004 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948050 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-plugins-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948076 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-stats-auth\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948102 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96f3be9c-c37e-4f19-b218-a9e6f8461d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948145 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61d5e54f-0eeb-4b59-9511-cdf807911640-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.948163 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.950256 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.952919 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.953072 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-service-ca\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.953805 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-etcd-service-ca\") pod 
\"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.953821 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-config\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.954145 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78753d91-e30f-43df-8c10-5f8a978a755f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.954358 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.954438 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e440ea37-6f17-4863-a958-9e4b9debe3e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.954544 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x5vtg\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-kube-api-access-x5vtg\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.954812 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96f3be9c-c37e-4f19-b218-a9e6f8461d02-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.955289 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e440ea37-6f17-4863-a958-9e4b9debe3e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.962792 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qn7z\" (UniqueName: \"kubernetes.io/projected/281cab3a-f295-45d8-90f6-7e010d5daea5-kube-api-access-6qn7z\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.970391 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bw85j\" (UID: 
\"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.970758 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/630fc50d-827c-4e60-87ad-aa5012ecbcd8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wl8mf\" (UID: \"630fc50d-827c-4e60-87ad-aa5012ecbcd8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.971184 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4003ab4-f058-48d2-835e-e50ecd38cebb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nglpx\" (UID: \"f4003ab4-f058-48d2-835e-e50ecd38cebb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.971572 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78753d91-e30f-43df-8c10-5f8a978a755f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.971572 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/281cab3a-f295-45d8-90f6-7e010d5daea5-metrics-tls\") pod \"dns-operator-744455d44c-dsh4l\" (UID: \"281cab3a-f295-45d8-90f6-7e010d5daea5\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.972237 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/78753d91-e30f-43df-8c10-5f8a978a755f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n59st\" (UID: \"78753d91-e30f-43df-8c10-5f8a978a755f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:15 crc kubenswrapper[5072]: I0228 04:13:15.988883 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckz6z\" (UniqueName: \"kubernetes.io/projected/bdabe9ca-055c-48ec-96c3-fb1ab2a342d5-kube-api-access-ckz6z\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbf7l\" (UID: \"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.008724 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.029772 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.057972 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.058406 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-key\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.058970 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-default-certificate\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059005 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq2t7\" (UniqueName: \"kubernetes.io/projected/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-kube-api-access-sq2t7\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059028 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnmkz\" (UniqueName: \"kubernetes.io/projected/3644bb59-5f48-4f4d-b244-d5128487ff5f-kube-api-access-jnmkz\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059057 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-metrics-certs\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " 
pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059088 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059112 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059138 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-csi-data-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059161 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059199 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sg8f\" (UniqueName: \"kubernetes.io/projected/d8d6896e-b13d-4c3b-b0e6-7feb24127794-kube-api-access-8sg8f\") pod 
\"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059244 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-registration-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059276 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059308 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbc6\" (UniqueName: \"kubernetes.io/projected/58b4223e-e786-4022-a470-4b3c6aa754dd-kube-api-access-xcbc6\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059337 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-srv-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059369 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51561c1c-e376-4dc6-9429-b1bc39a54988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059399 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-mountpoint-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059425 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059483 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-cabundle\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059512 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxjs\" (UniqueName: \"kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs\") pod \"auto-csr-approver-29537532-qwbxr\" (UID: \"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac\") " pod="openshift-infra/auto-csr-approver-29537532-qwbxr" Feb 28 04:13:16 crc 
kubenswrapper[5072]: I0228 04:13:16.059541 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsjmk\" (UniqueName: \"kubernetes.io/projected/5a1b591b-8355-4747-8858-baf81ce6928d-kube-api-access-dsjmk\") pod \"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059572 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059601 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-srv-cert\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059632 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghwrz\" (UniqueName: \"kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059684 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2zjk\" (UniqueName: \"kubernetes.io/projected/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-kube-api-access-g2zjk\") pod \"service-ca-operator-777779d784-6htld\" (UID: 
\"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059713 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4zd\" (UniqueName: \"kubernetes.io/projected/51561c1c-e376-4dc6-9429-b1bc39a54988-kube-api-access-nq4zd\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059763 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac478b24-16b1-460c-97d2-6be88f80ed94-config-volume\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059795 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-certs\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059845 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8hh\" (UniqueName: \"kubernetes.io/projected/908db76f-57be-45ae-9ebe-40d4e707fc5c-kube-api-access-lz8hh\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059882 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a1b591b-8355-4747-8858-baf81ce6928d-cert\") pod 
\"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059914 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txdpk\" (UniqueName: \"kubernetes.io/projected/61d5e54f-0eeb-4b59-9511-cdf807911640-kube-api-access-txdpk\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059943 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h9t2\" (UniqueName: \"kubernetes.io/projected/ac478b24-16b1-460c-97d2-6be88f80ed94-kube-api-access-6h9t2\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.059975 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-socket-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060004 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-node-bootstrap-token\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060046 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ac478b24-16b1-460c-97d2-6be88f80ed94-metrics-tls\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060094 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjn5w\" (UniqueName: \"kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060127 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d8d6896e-b13d-4c3b-b0e6-7feb24127794-tmpfs\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060162 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060190 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57rp\" (UniqueName: \"kubernetes.io/projected/49976b40-ea77-4857-a99f-f4a65df82e05-kube-api-access-z57rp\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060233 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4h6z9\" (UniqueName: \"kubernetes.io/projected/7f7f662b-a78e-4292-871f-e1c6cbfca641-kube-api-access-4h6z9\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060262 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49976b40-ea77-4857-a99f-f4a65df82e05-service-ca-bundle\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060299 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-serving-cert\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060331 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-plugins-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060362 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-stats-auth\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060395 5072 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61d5e54f-0eeb-4b59-9511-cdf807911640-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060425 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060456 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-config\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.060953 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-csi-data-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.061161 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-socket-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.061222 
5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-cabundle\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.061422 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.561376449 +0000 UTC m=+218.556106831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.061897 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-registration-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.062108 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-mountpoint-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 
04:13:16.066028 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-profile-collector-cert\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.066198 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.067260 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51561c1c-e376-4dc6-9429-b1bc39a54988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.070975 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-srv-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.072010 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7f7f662b-a78e-4292-871f-e1c6cbfca641-signing-key\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-jp259" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.072183 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4k4q\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.072867 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d8d6896e-b13d-4c3b-b0e6-7feb24127794-tmpfs\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.073752 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.074225 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-apiservice-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.074242 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49976b40-ea77-4857-a99f-f4a65df82e05-service-ca-bundle\") pod \"router-default-5444994796-9wdtp\" (UID: 
\"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.075006 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-node-bootstrap-token\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.075083 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-plugins-dir\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.075547 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8d6896e-b13d-4c3b-b0e6-7feb24127794-webhook-cert\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.075839 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-metrics-certs\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.075988 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a1b591b-8355-4747-8858-baf81ce6928d-cert\") pod \"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " 
pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.076483 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-stats-auth\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.077231 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3644bb59-5f48-4f4d-b244-d5128487ff5f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.078384 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61d5e54f-0eeb-4b59-9511-cdf807911640-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.079080 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-serving-cert\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.079281 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-config\") pod \"service-ca-operator-777779d784-6htld\" (UID: 
\"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.079597 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.089251 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac478b24-16b1-460c-97d2-6be88f80ed94-config-volume\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.094205 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac478b24-16b1-460c-97d2-6be88f80ed94-metrics-tls\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.096268 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.097700 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.098818 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktfmb\" (UniqueName: \"kubernetes.io/projected/672db961-8de6-46ec-9dd8-5d2ef7572eef-kube-api-access-ktfmb\") pod \"control-plane-machine-set-operator-78cbb6b69f-zc5mk\" (UID: \"672db961-8de6-46ec-9dd8-5d2ef7572eef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.101021 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/908db76f-57be-45ae-9ebe-40d4e707fc5c-certs\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.101063 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49976b40-ea77-4857-a99f-f4a65df82e05-default-certificate\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.103045 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/58b4223e-e786-4022-a470-4b3c6aa754dd-srv-cert\") pod 
\"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.113411 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.127355 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnlvr\" (UniqueName: \"kubernetes.io/projected/9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec-kube-api-access-bnlvr\") pod \"etcd-operator-b45778765-x4ltm\" (UID: \"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.145911 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.148592 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5vtg\" (UniqueName: \"kubernetes.io/projected/96f3be9c-c37e-4f19-b218-a9e6f8461d02-kube-api-access-x5vtg\") pod \"cluster-image-registry-operator-dc59b4c8b-8p4zs\" (UID: \"96f3be9c-c37e-4f19-b218-a9e6f8461d02\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.156836 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmcz\" (UniqueName: \"kubernetes.io/projected/e440ea37-6f17-4863-a958-9e4b9debe3e3-kube-api-access-2lmcz\") pod \"machine-config-controller-84d6567774-dsp7s\" (UID: \"e440ea37-6f17-4863-a958-9e4b9debe3e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 
04:13:16.158813 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58vm7" event={"ID":"0bc940ed-4de2-4e88-98d7-8d9de59cd63d","Type":"ContainerStarted","Data":"19d8ad6146b49f2d3768e0faea9f892396aa650e3af113b5bf550b8b5fd9b953"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.158870 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58vm7" event={"ID":"0bc940ed-4de2-4e88-98d7-8d9de59cd63d","Type":"ContainerStarted","Data":"24e24b2219a19a5dede0389a5cc36d5fd02a8f2c7c43725511c7beed859a2a52"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.168305 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.168745 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.668723821 +0000 UTC m=+218.663454073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.177816 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" event={"ID":"497b9208-4958-46e8-8aeb-8bc2e0f172d6","Type":"ContainerStarted","Data":"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.179445 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.179547 5072 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4d9s6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.179581 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.179884 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.181190 5072 generic.go:334] "Generic (PLEG): container finished" podID="c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8" containerID="051cf58277a4ea8a2e939b21f78c854bc224ccc8d87f637c989957a24a5ed0a9" exitCode=0 Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.181232 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" event={"ID":"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8","Type":"ContainerDied","Data":"051cf58277a4ea8a2e939b21f78c854bc224ccc8d87f637c989957a24a5ed0a9"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.181250 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" event={"ID":"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8","Type":"ContainerStarted","Data":"e18cde134b23027c253ddafe45940b2abe93468605d6b5cc600e0e5bdb7eb965"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.187350 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" event={"ID":"5047b83c-e577-419c-b2e0-05c6f7baaef3","Type":"ContainerStarted","Data":"5b3a04b21a957958ade98b7e6837541212184c1bace20e4241191945322a2e76"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.196996 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.197975 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" event={"ID":"9035860a-a3e9-439c-bfed-89b060a0bdc5","Type":"ContainerStarted","Data":"2d72feba3947843e79466d9daad4199e56072fc885b9d8481a72c8c855d4d2d6"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.198027 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" event={"ID":"9035860a-a3e9-439c-bfed-89b060a0bdc5","Type":"ContainerStarted","Data":"cd7693a6131e504c58b89be9fa72c7f2cc4d227f3cfea8b70d66021d41d98237"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.199465 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbc6\" (UniqueName: \"kubernetes.io/projected/58b4223e-e786-4022-a470-4b3c6aa754dd-kube-api-access-xcbc6\") pod \"catalog-operator-68c6474976-hxqzd\" (UID: \"58b4223e-e786-4022-a470-4b3c6aa754dd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.207904 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxjs\" (UniqueName: \"kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs\") pod \"auto-csr-approver-29537532-qwbxr\" (UID: \"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac\") " pod="openshift-infra/auto-csr-approver-29537532-qwbxr" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.208185 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" 
event={"ID":"3b41acd5-2f6b-48fd-a9ba-55796e6db653","Type":"ContainerStarted","Data":"67c2a17ca7a758951d1bff9a14bb12a1de825717f63d7a3d3f74a65fe2a48f2e"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.208297 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" event={"ID":"3b41acd5-2f6b-48fd-a9ba-55796e6db653","Type":"ContainerStarted","Data":"5020feb8d5301cfd61745896ccbd8c1f8791245b4c8dd43d94960310d7c47b7a"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.208328 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.210032 5072 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dcccb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.210085 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.214003 5072 generic.go:334] "Generic (PLEG): container finished" podID="b1e546a4-808d-4c86-a8ab-274186b278a6" containerID="b07cbd3391f46538b04555851a72dc75ee08751d7f8d8e1e29bd873ef3b19fc7" exitCode=0 Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.214101 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" 
event={"ID":"b1e546a4-808d-4c86-a8ab-274186b278a6","Type":"ContainerDied","Data":"b07cbd3391f46538b04555851a72dc75ee08751d7f8d8e1e29bd873ef3b19fc7"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.215919 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.225779 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" event={"ID":"707bbe1d-eb9e-4d9d-8e70-e88429b8c077","Type":"ContainerStarted","Data":"21bbc52ef8184dbca5965c8d5bc9fe8b534006f02f7c3d184a280d382dc69812"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.225836 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" event={"ID":"707bbe1d-eb9e-4d9d-8e70-e88429b8c077","Type":"ContainerStarted","Data":"89377d3c32cc52db1c621fbf688ad6c065e651cbeca790cb763f8aa50b58d376"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.225852 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" event={"ID":"707bbe1d-eb9e-4d9d-8e70-e88429b8c077","Type":"ContainerStarted","Data":"7e91b2258b67cd6aa89991e623faae6612e6616f429cb4fff7ef133e7a1071d3"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.231495 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-49m8v" event={"ID":"c36ce709-c726-4390-abb9-2ebcaecbf1c0","Type":"ContainerStarted","Data":"7d4ad3a8c5e1c0241183662da3e96bf17c32d1e3dcd80bb9de19219b1879d5d1"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.231536 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-49m8v" 
event={"ID":"c36ce709-c726-4390-abb9-2ebcaecbf1c0","Type":"ContainerStarted","Data":"c026552cdd1b67c4acbbf2c2e2ef9aeacaa9354364f29d18704f8f7a52dcf4a4"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.232456 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.235870 5072 patch_prober.go:28] interesting pod/console-operator-58897d9998-49m8v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.235909 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podUID="c36ce709-c726-4390-abb9-2ebcaecbf1c0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.236060 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" event={"ID":"99ab1bd5-e727-47e3-8f77-78f5a5795c7c","Type":"ContainerStarted","Data":"9d1d022eae0701a5b1931eb742916498bb447a951b11c274b02f0309566b37cc"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.236085 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" event={"ID":"99ab1bd5-e727-47e3-8f77-78f5a5795c7c","Type":"ContainerStarted","Data":"564f8fa56c82dfce0c1d1596d502cd9d19e295c9b18742c53d3833f2f7d40fb1"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.243130 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" 
event={"ID":"280cee64-7ec9-4dd4-9fb8-931b2d7a5818","Type":"ContainerStarted","Data":"8caa6763053058ac0e33ce497865ba4ee49d0fc1ce1979cac027ea94506181df"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.243187 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" event={"ID":"280cee64-7ec9-4dd4-9fb8-931b2d7a5818","Type":"ContainerStarted","Data":"0a133e7288ec2e6a2deab5a79ba9aa2221a8546c38f4ba27c357dd7350829d1a"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.245195 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.246630 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" event={"ID":"f0aaa88f-ecf9-47b0-9349-737e855a9ed4","Type":"ContainerStarted","Data":"af501b8c743bb95ade214c6fcd01145be4217bb22e65e32d4784c84c8275eb0a"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.249863 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" event={"ID":"0fb104aa-1ec4-4941-acfb-8d8458df7d4a","Type":"ContainerStarted","Data":"23996e0c78a982118d6ee651c65ee99b5fca4ccbc8b709487719a3e0f784b3b2"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.250365 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" event={"ID":"0fb104aa-1ec4-4941-acfb-8d8458df7d4a","Type":"ContainerStarted","Data":"58e8134368677337cc3e1e9e8e6975519c23c2d57252db9ea9031aff9d58bbf6"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.254911 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsjmk\" (UniqueName: 
\"kubernetes.io/projected/5a1b591b-8355-4747-8858-baf81ce6928d-kube-api-access-dsjmk\") pod \"ingress-canary-md7mn\" (UID: \"5a1b591b-8355-4747-8858-baf81ce6928d\") " pod="openshift-ingress-canary/ingress-canary-md7mn" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.256458 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sg8f\" (UniqueName: \"kubernetes.io/projected/d8d6896e-b13d-4c3b-b0e6-7feb24127794-kube-api-access-8sg8f\") pod \"packageserver-d55dfcdfc-q7b24\" (UID: \"d8d6896e-b13d-4c3b-b0e6-7feb24127794\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.257770 5072 generic.go:334] "Generic (PLEG): container finished" podID="37da02ec-0d39-471d-93ea-5cca2236656d" containerID="3f529ef3a11c7993b556c278cf4e3c195b025a39f446512e1bd4e66a18f62f8e" exitCode=0 Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.257847 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" event={"ID":"37da02ec-0d39-471d-93ea-5cca2236656d","Type":"ContainerDied","Data":"3f529ef3a11c7993b556c278cf4e3c195b025a39f446512e1bd4e66a18f62f8e"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.257885 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" event={"ID":"37da02ec-0d39-471d-93ea-5cca2236656d","Type":"ContainerStarted","Data":"bb2b77d2742ebaef85ba089fcf1e76914c3795041a2627ec953c43f993d0c656"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.266886 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnmkz\" (UniqueName: \"kubernetes.io/projected/3644bb59-5f48-4f4d-b244-d5128487ff5f-kube-api-access-jnmkz\") pod \"olm-operator-6b444d44fb-4lstm\" (UID: \"3644bb59-5f48-4f4d-b244-d5128487ff5f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.269883 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.270374 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.770350646 +0000 UTC m=+218.765080838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.272816 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.275946 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.775926989 +0000 UTC m=+218.770657181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.292214 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.306884 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.307901 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" event={"ID":"39c4f31f-248a-42ad-9aa0-166f036be3ac","Type":"ContainerStarted","Data":"08c619beda1285f2967d5322b16795b71c5b983a53db422695f488598fb5ac47"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.310781 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" event={"ID":"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30","Type":"ContainerStarted","Data":"cdbeeed7d6697f3f1580ffc4f26dd1a3313486578e66be025ddb8d5720b6a621"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.311789 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.314600 5072 
patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-j55ft container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.314710 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.316486 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" event={"ID":"a2b10a22-bf51-42f1-82e7-23b09dc84d3a","Type":"ContainerStarted","Data":"600e5fd94b6a4698391824780b7464d9525fe0d91ae1b83d589fa9834b51d261"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.316651 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" event={"ID":"a2b10a22-bf51-42f1-82e7-23b09dc84d3a","Type":"ContainerStarted","Data":"42b640c90025946c4268fb417bec199f3bb9c8233c6eb1a86fe467a21d3de598"} Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.317461 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.317522 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mzmcb" 
podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.320741 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8hh\" (UniqueName: \"kubernetes.io/projected/908db76f-57be-45ae-9ebe-40d4e707fc5c-kube-api-access-lz8hh\") pod \"machine-config-server-4qfjg\" (UID: \"908db76f-57be-45ae-9ebe-40d4e707fc5c\") " pod="openshift-machine-config-operator/machine-config-server-4qfjg"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.323343 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.323498 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq2t7\" (UniqueName: \"kubernetes.io/projected/0caa7e2d-d3a6-4df8-a734-0d607ec64f5c-kube-api-access-sq2t7\") pod \"csi-hostpathplugin-mvt85\" (UID: \"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c\") " pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.344282 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txdpk\" (UniqueName: \"kubernetes.io/projected/61d5e54f-0eeb-4b59-9511-cdf807911640-kube-api-access-txdpk\") pod \"multus-admission-controller-857f4d67dd-64249\" (UID: \"61d5e54f-0eeb-4b59-9511-cdf807911640\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-64249"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.348342 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2zjk\" (UniqueName: \"kubernetes.io/projected/c329b662-2d9c-4a89-b244-76d7cd9c5c5c-kube-api-access-g2zjk\") pod \"service-ca-operator-777779d784-6htld\" (UID: \"c329b662-2d9c-4a89-b244-76d7cd9c5c5c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.368867 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537532-qwbxr"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.370183 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghwrz\" (UniqueName: \"kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz\") pod \"collect-profiles-29537520-d72cz\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.374047 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.374322 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.874302232 +0000 UTC m=+218.869032424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.374457 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.383324 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.883309702 +0000 UTC m=+218.878039884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.386069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.396750 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mvt85"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.412813 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4qfjg"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.413803 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-md7mn"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.419155 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.438512 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h6z9\" (UniqueName: \"kubernetes.io/projected/7f7f662b-a78e-4292-871f-e1c6cbfca641-kube-api-access-4h6z9\") pod \"service-ca-9c57cc56f-jp259\" (UID: \"7f7f662b-a78e-4292-871f-e1c6cbfca641\") " pod="openshift-service-ca/service-ca-9c57cc56f-jp259"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.443557 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57rp\" (UniqueName: \"kubernetes.io/projected/49976b40-ea77-4857-a99f-f4a65df82e05-kube-api-access-z57rp\") pod \"router-default-5444994796-9wdtp\" (UID: \"49976b40-ea77-4857-a99f-f4a65df82e05\") " pod="openshift-ingress/router-default-5444994796-9wdtp"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.448343 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4zd\" (UniqueName: \"kubernetes.io/projected/51561c1c-e376-4dc6-9429-b1bc39a54988-kube-api-access-nq4zd\") pod \"package-server-manager-789f6589d5-hklhr\" (UID: \"51561c1c-e376-4dc6-9429-b1bc39a54988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.473159 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjn5w\" (UniqueName: \"kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w\") pod \"marketplace-operator-79b997595-stshz\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") " pod="openshift-marketplace/marketplace-operator-79b997595-stshz"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.477832 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.479374 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:16.979268831 +0000 UTC m=+218.973999133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.486953 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h9t2\" (UniqueName: \"kubernetes.io/projected/ac478b24-16b1-460c-97d2-6be88f80ed94-kube-api-access-6h9t2\") pod \"dns-default-wnkn9\" (UID: \"ac478b24-16b1-460c-97d2-6be88f80ed94\") " pod="openshift-dns/dns-default-wnkn9"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.569170 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stshz"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.604614 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.604672 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9wdtp"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.606133 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.607311 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.107294134 +0000 UTC m=+219.102024326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.615202 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.630682 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.648598 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.648902 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jp259"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.677070 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wnkn9"
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.707806 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.708347 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.208331931 +0000 UTC m=+219.203062113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.762581 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsh4l"]
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.809427 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.809748 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.309737919 +0000 UTC m=+219.304468111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.810589 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l"]
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.910317 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.910583 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.410558268 +0000 UTC m=+219.405288460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:16 crc kubenswrapper[5072]: I0228 04:13:16.911280 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:16 crc kubenswrapper[5072]: E0228 04:13:16.911623 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.41161194 +0000 UTC m=+219.406342132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.002771 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx"]
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.014853 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.015561 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.515534086 +0000 UTC m=+219.510264278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.104774 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st"]
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.124777 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.125165 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.625147569 +0000 UTC m=+219.619877761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.229128 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.229890 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.72987411 +0000 UTC m=+219.724604302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.235078 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s"]
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.278596 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk"]
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.329403 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" event={"ID":"78753d91-e30f-43df-8c10-5f8a978a755f","Type":"ContainerStarted","Data":"03664cb08b9950ceea2f075b4eef709e964aff0a5fe53a4eed0f80807d9746d0"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.331545 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.331871 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.831858475 +0000 UTC m=+219.826588657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.346135 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" event={"ID":"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5","Type":"ContainerStarted","Data":"da7a500d6defe6af344a50f317ca9bf3254578768b24bf3343beb08dcbe7cc8a"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.352803 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" event={"ID":"281cab3a-f295-45d8-90f6-7e010d5daea5","Type":"ContainerStarted","Data":"3b50ca0f27a0bfbade6f786811fe235cf1c8d65433d01f8e1e9ccdc47e97c3f0"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.354865 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9wdtp" event={"ID":"49976b40-ea77-4857-a99f-f4a65df82e05","Type":"ContainerStarted","Data":"7b5c67c277512f1c4a0f00866d103347da53095b4bc07f61e4b5ad97bf28f33b"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.361574 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" event={"ID":"0fb104aa-1ec4-4941-acfb-8d8458df7d4a","Type":"ContainerStarted","Data":"36ebf93bc8c31229db07b8dae6d4c2df679b60a0b6e2cf3f31265dea95a0b11e"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.373602 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" event={"ID":"b1e546a4-808d-4c86-a8ab-274186b278a6","Type":"ContainerStarted","Data":"1b8d3b67032c2f3e6d52ba154aa8e68d2113c9ffe35467e36c827356d5d863de"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.377039 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" event={"ID":"99ab1bd5-e727-47e3-8f77-78f5a5795c7c","Type":"ContainerStarted","Data":"fcf2b97aa8567e1a4541426648d74a66b0583dee3f37520f234e04db73314a61"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.381149 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" event={"ID":"f4003ab4-f058-48d2-835e-e50ecd38cebb","Type":"ContainerStarted","Data":"f248e8e3fdcba127f3d9794a34b0a440817c3cbabca619e6fd5368d8fa09ef3b"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.383472 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" event={"ID":"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8","Type":"ContainerStarted","Data":"79c616cdc4280b640fb31f3af3052a737f3198eabc7228da4b48c88644462d26"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.385965 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4qfjg" event={"ID":"908db76f-57be-45ae-9ebe-40d4e707fc5c","Type":"ContainerStarted","Data":"ba87d26f853b830db320df99b389c45166222004c952503362b883964754c883"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.387936 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" event={"ID":"9035860a-a3e9-439c-bfed-89b060a0bdc5","Type":"ContainerStarted","Data":"22c964396d48cc362810e42b9f8118a095cdf73879649d2a0b3c2616e88b5dac"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.398698 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" event={"ID":"37da02ec-0d39-471d-93ea-5cca2236656d","Type":"ContainerStarted","Data":"a5fc0780f5ed07ea019df3cf1382e106a9f4b322301a5a41d4fc38b9c72427d0"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.399678 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96"
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.412033 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" event={"ID":"39c4f31f-248a-42ad-9aa0-166f036be3ac","Type":"ContainerStarted","Data":"c38080263ee59f4a8543d73bc4f2ce9dadb4450859b4f7c6fd1c72724510e624"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.412278 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" event={"ID":"39c4f31f-248a-42ad-9aa0-166f036be3ac","Type":"ContainerStarted","Data":"0771e39d8dab411671cfc4649605f54dec52d0175be13158fbc6ec39af02072d"}
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.413571 5072 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dcccb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.413710 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.414322 5072 patch_prober.go:28] interesting pod/console-operator-58897d9998-49m8v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.414357 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podUID="c36ce709-c726-4390-abb9-2ebcaecbf1c0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.445519 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.451021 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.950990054 +0000 UTC m=+219.945720256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.466700 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.468187 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:17.968163517 +0000 UTC m=+219.962893709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.575834 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.577476 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.077439569 +0000 UTC m=+220.072169761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.579494 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.611625 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.111606459 +0000 UTC m=+220.106336721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: W0228 04:13:17.641378 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod672db961_8de6_46ec_9dd8_5d2ef7572eef.slice/crio-271896e15cee489fb798be9866c6c0cdef8beec0996c7eea0cb7c713c702bf35 WatchSource:0}: Error finding container 271896e15cee489fb798be9866c6c0cdef8beec0996c7eea0cb7c713c702bf35: Status 404 returned error can't find the container with id 271896e15cee489fb798be9866c6c0cdef8beec0996c7eea0cb7c713c702bf35
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.689108 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.689421 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.189404125 +0000 UTC m=+220.184134317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: W0228 04:13:17.698579 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode440ea37_6f17_4863_a958_9e4b9debe3e3.slice/crio-36a8e0283c305c832f40145f42131e6ad5f738a2c9d42656021a41ad36899436 WatchSource:0}: Error finding container 36a8e0283c305c832f40145f42131e6ad5f738a2c9d42656021a41ad36899436: Status 404 returned error can't find the container with id 36a8e0283c305c832f40145f42131e6ad5f738a2c9d42656021a41ad36899436
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.719092 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd"]
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.790090 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.790418 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.29040411 +0000 UTC m=+220.285134302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.891870 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.892374 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.392348734 +0000 UTC m=+220.387078926 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.892505 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.892888 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.392877861 +0000 UTC m=+220.387608093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.903440 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:17 crc kubenswrapper[5072]: I0228 04:13:17.992933 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:17 crc kubenswrapper[5072]: E0228 04:13:17.993795 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.493775353 +0000 UTC m=+220.488505545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.013711 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" podStartSLOduration=174.013693391 podStartE2EDuration="2m54.013693391s" podCreationTimestamp="2026-02-28 04:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.013325599 +0000 UTC m=+220.008055791" watchObservedRunningTime="2026-02-28 04:13:18.013693391 +0000 UTC m=+220.008423583" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.014581 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tdnkt" podStartSLOduration=175.014572388 podStartE2EDuration="2m55.014572388s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:17.972508583 +0000 UTC m=+219.967238775" watchObservedRunningTime="2026-02-28 04:13:18.014572388 +0000 UTC m=+220.009302580" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.057751 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" podStartSLOduration=175.057730578 podStartE2EDuration="2m55.057730578s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.055415106 +0000 UTC m=+220.050145318" watchObservedRunningTime="2026-02-28 04:13:18.057730578 +0000 UTC m=+220.052460770" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.092778 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrbwn" podStartSLOduration=175.092761896 podStartE2EDuration="2m55.092761896s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.09000181 +0000 UTC m=+220.084732002" watchObservedRunningTime="2026-02-28 04:13:18.092761896 +0000 UTC m=+220.087492088" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.098193 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.098589 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.598575555 +0000 UTC m=+220.593305747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.186253 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9vfgz" podStartSLOduration=175.186178465 podStartE2EDuration="2m55.186178465s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.184156222 +0000 UTC m=+220.178886414" watchObservedRunningTime="2026-02-28 04:13:18.186178465 +0000 UTC m=+220.180908657" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.201701 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.202252 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.702232553 +0000 UTC m=+220.696962745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.224335 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t44gr" podStartSLOduration=175.224303819 podStartE2EDuration="2m55.224303819s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.144039696 +0000 UTC m=+220.138769888" watchObservedRunningTime="2026-02-28 04:13:18.224303819 +0000 UTC m=+220.219034021" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.224827 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" podStartSLOduration=175.224820984 podStartE2EDuration="2m55.224820984s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.205969119 +0000 UTC m=+220.200699321" watchObservedRunningTime="2026-02-28 04:13:18.224820984 +0000 UTC m=+220.219551176" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.285855 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbm42" podStartSLOduration=175.285838299 podStartE2EDuration="2m55.285838299s" podCreationTimestamp="2026-02-28 
04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.285144717 +0000 UTC m=+220.279874919" watchObservedRunningTime="2026-02-28 04:13:18.285838299 +0000 UTC m=+220.280568491" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.295902 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gf64p" podStartSLOduration=175.29587378 podStartE2EDuration="2m55.29587378s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.249943214 +0000 UTC m=+220.244673416" watchObservedRunningTime="2026-02-28 04:13:18.29587378 +0000 UTC m=+220.290603982" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.303437 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.303810 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.803795876 +0000 UTC m=+220.798526078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.401365 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podStartSLOduration=175.401347154 podStartE2EDuration="2m55.401347154s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.400202139 +0000 UTC m=+220.394932331" watchObservedRunningTime="2026-02-28 04:13:18.401347154 +0000 UTC m=+220.396077346" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.405095 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.405234 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.905205824 +0000 UTC m=+220.899936026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.405408 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.405745 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:18.905735 +0000 UTC m=+220.900465192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.414836 5072 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4d9s6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.414908 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.419118 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4qfjg" event={"ID":"908db76f-57be-45ae-9ebe-40d4e707fc5c","Type":"ContainerStarted","Data":"e7f4262f0a843cf4b608e67016a13edffef764ad1e3139111505c244ccff9292"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.421770 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" event={"ID":"672db961-8de6-46ec-9dd8-5d2ef7572eef","Type":"ContainerStarted","Data":"271896e15cee489fb798be9866c6c0cdef8beec0996c7eea0cb7c713c702bf35"} Feb 28 04:13:18 
crc kubenswrapper[5072]: I0228 04:13:18.423207 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" event={"ID":"e440ea37-6f17-4863-a958-9e4b9debe3e3","Type":"ContainerStarted","Data":"36a8e0283c305c832f40145f42131e6ad5f738a2c9d42656021a41ad36899436"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.430329 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9wdtp" event={"ID":"49976b40-ea77-4857-a99f-f4a65df82e05","Type":"ContainerStarted","Data":"b4128e2021c10f504b09d63051f9705b94fe0163b162e6df03b0995e23b58de9"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.430436 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.434963 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" event={"ID":"bdabe9ca-055c-48ec-96c3-fb1ab2a342d5","Type":"ContainerStarted","Data":"c404ae18550560b9ad12d236b2415b3795aec5eb44578443c14bb386b58f8630"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.443932 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" event={"ID":"58b4223e-e786-4022-a470-4b3c6aa754dd","Type":"ContainerStarted","Data":"d27346e4b44791cb15bade7cea659f159eaa392e7ceab5ee616d284c4cb97b94"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.446601 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" podStartSLOduration=175.446586888 podStartE2EDuration="2m55.446586888s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-28 04:13:18.444807303 +0000 UTC m=+220.439537505" watchObservedRunningTime="2026-02-28 04:13:18.446586888 +0000 UTC m=+220.441317080" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.449370 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" event={"ID":"281cab3a-f295-45d8-90f6-7e010d5daea5","Type":"ContainerStarted","Data":"91a10accd9b254b441ca286ee98a7dac17bb94ff143ada18c3402e4f45da52ce"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.460371 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" event={"ID":"c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8","Type":"ContainerStarted","Data":"df10d77e27ea88183484871baa7c77d57666d28107e373751364f10ef168180f"} Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.461301 5072 patch_prober.go:28] interesting pod/console-operator-58897d9998-49m8v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.461337 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podUID="c36ce709-c726-4390-abb9-2ebcaecbf1c0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.470788 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x4ltm"] Feb 28 04:13:18 crc kubenswrapper[5072]: W0228 04:13:18.481577 5072 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac6ac5f_443a_4b4f_88df_eaa38b4fe8ec.slice/crio-e1bbc5d2a523d78472ca07bcc996c4d64188b10a655c8d18fb9a6a55e2eff7c8 WatchSource:0}: Error finding container e1bbc5d2a523d78472ca07bcc996c4d64188b10a655c8d18fb9a6a55e2eff7c8: Status 404 returned error can't find the container with id e1bbc5d2a523d78472ca07bcc996c4d64188b10a655c8d18fb9a6a55e2eff7c8 Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.483359 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.498535 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-58vm7" podStartSLOduration=175.49851268 podStartE2EDuration="2m55.49851268s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.494742293 +0000 UTC m=+220.489472495" watchObservedRunningTime="2026-02-28 04:13:18.49851268 +0000 UTC m=+220.493242882" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.504395 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.506170 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.506274 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.006259311 +0000 UTC m=+221.000989493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.506678 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.508416 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.008407937 +0000 UTC m=+221.003138129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.547573 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.552339 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537532-qwbxr"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.581854 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mzmcb" podStartSLOduration=175.581835677 podStartE2EDuration="2m55.581835677s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.575142568 +0000 UTC m=+220.569872760" watchObservedRunningTime="2026-02-28 04:13:18.581835677 +0000 UTC m=+220.576565869" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.584713 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-md7mn"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.591571 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.610365 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.610861 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.110845737 +0000 UTC m=+221.105575919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.613944 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.617820 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.617911 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.627102 5072 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-flgmc" podStartSLOduration=175.627076821 podStartE2EDuration="2m55.627076821s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.625179071 +0000 UTC m=+220.619909263" watchObservedRunningTime="2026-02-28 04:13:18.627076821 +0000 UTC m=+220.621807013" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.680566 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" podStartSLOduration=175.680541671 podStartE2EDuration="2m55.680541671s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.665500573 +0000 UTC m=+220.660230765" watchObservedRunningTime="2026-02-28 04:13:18.680541671 +0000 UTC m=+220.675271853" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.708577 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.710049 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-64249"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.712716 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc 
kubenswrapper[5072]: E0228 04:13:18.713224 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.213202454 +0000 UTC m=+221.207932646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.728199 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.743701 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jp259"] Feb 28 04:13:18 crc kubenswrapper[5072]: W0228 04:13:18.752002 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7f662b_a78e_4292_871f_e1c6cbfca641.slice/crio-34bce174e79c48c2d296a2093859117f8bad38478dd304ccd4c7620a21763205 WatchSource:0}: Error finding container 34bce174e79c48c2d296a2093859117f8bad38478dd304ccd4c7620a21763205: Status 404 returned error can't find the container with id 34bce174e79c48c2d296a2093859117f8bad38478dd304ccd4c7620a21763205 Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.765115 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.769315 5072 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mvt85"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.771422 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6htld"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.772404 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" podStartSLOduration=175.772391042 podStartE2EDuration="2m55.772391042s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.706824556 +0000 UTC m=+220.701554748" watchObservedRunningTime="2026-02-28 04:13:18.772391042 +0000 UTC m=+220.767121234" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.788510 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wnkn9"] Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.789375 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kmrvp" podStartSLOduration=175.789363808 podStartE2EDuration="2m55.789363808s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.733022579 +0000 UTC m=+220.727752791" watchObservedRunningTime="2026-02-28 04:13:18.789363808 +0000 UTC m=+220.784094000" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.813202 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.813540 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.313523558 +0000 UTC m=+221.308253750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:18 crc kubenswrapper[5072]: W0228 04:13:18.813595 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d5e54f_0eeb_4b59_9511_cdf807911640.slice/crio-e5f7d4bc049ecb94efc6d90b1997c58a3db995fb8d531f9c35f86aa56e4bf2fd WatchSource:0}: Error finding container e5f7d4bc049ecb94efc6d90b1997c58a3db995fb8d531f9c35f86aa56e4bf2fd: Status 404 returned error can't find the container with id e5f7d4bc049ecb94efc6d90b1997c58a3db995fb8d531f9c35f86aa56e4bf2fd Feb 28 04:13:18 crc kubenswrapper[5072]: W0228 04:13:18.815238 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0caa7e2d_d3a6_4df8_a734_0d607ec64f5c.slice/crio-0ef6b917ec3a1f23eb4eea5bc441fe6ff05bbdda96c6928e48f379f40c81c383 WatchSource:0}: Error finding container 0ef6b917ec3a1f23eb4eea5bc441fe6ff05bbdda96c6928e48f379f40c81c383: Status 404 returned error can't find the container with id 0ef6b917ec3a1f23eb4eea5bc441fe6ff05bbdda96c6928e48f379f40c81c383 Feb 28 04:13:18 crc 
kubenswrapper[5072]: W0228 04:13:18.829023 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51561c1c_e376_4dc6_9429_b1bc39a54988.slice/crio-1e173ae493c8d3a6bc60e8ac324c7365516984e5fad41c6102e269e9b5a28d8c WatchSource:0}: Error finding container 1e173ae493c8d3a6bc60e8ac324c7365516984e5fad41c6102e269e9b5a28d8c: Status 404 returned error can't find the container with id 1e173ae493c8d3a6bc60e8ac324c7365516984e5fad41c6102e269e9b5a28d8c Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.840131 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4qfjg" podStartSLOduration=5.840113114 podStartE2EDuration="5.840113114s" podCreationTimestamp="2026-02-28 04:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:18.839702711 +0000 UTC m=+220.834432923" watchObservedRunningTime="2026-02-28 04:13:18.840113114 +0000 UTC m=+220.834843296" Feb 28 04:13:18 crc kubenswrapper[5072]: I0228 04:13:18.914274 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:18 crc kubenswrapper[5072]: E0228 04:13:18.914582 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.414571995 +0000 UTC m=+221.409302187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.015489 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.016289 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.516263702 +0000 UTC m=+221.510993894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.095487 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbf7l" podStartSLOduration=176.09546606 podStartE2EDuration="2m56.09546606s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.094593203 +0000 UTC m=+221.089323395" watchObservedRunningTime="2026-02-28 04:13:19.09546606 +0000 UTC m=+221.090196252" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.117229 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.118199 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.618187825 +0000 UTC m=+221.612918017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.186204 5072 ???:1] "http: TLS handshake error from 192.168.126.11:60946: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.220261 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.220737 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.720721808 +0000 UTC m=+221.715452000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.276852 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9wdtp" podStartSLOduration=176.276795679 podStartE2EDuration="2m56.276795679s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.267202891 +0000 UTC m=+221.261933083" watchObservedRunningTime="2026-02-28 04:13:19.276795679 +0000 UTC m=+221.271525871" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.286890 5072 ???:1] "http: TLS handshake error from 192.168.126.11:60962: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.322853 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.822838958 +0000 UTC m=+221.817569150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.322483 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.399804 5072 ???:1] "http: TLS handshake error from 192.168.126.11:60974: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.439619 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.442720 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.942685368 +0000 UTC m=+221.937415570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.444305 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.448877 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:19.94885913 +0000 UTC m=+221.943589322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.542449 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.542846 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.546093 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.546825 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.04680899 +0000 UTC m=+222.041539182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.561102 5072 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-qvn7m container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.561165 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" podUID="b1e546a4-808d-4c86-a8ab-274186b278a6" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.583616 5072 ???:1] "http: TLS handshake error from 192.168.126.11:60978: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.587262 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wnkn9" event={"ID":"ac478b24-16b1-460c-97d2-6be88f80ed94","Type":"ContainerStarted","Data":"8a729b522f1fc946999194eab0c64dfda02c5c00cf355ba6eb5e062b89c16135"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.594204 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" 
event={"ID":"f4003ab4-f058-48d2-835e-e50ecd38cebb","Type":"ContainerStarted","Data":"18e1de1758f73731315c4b3cfc2e59d1cfee5e91cf16f9d769dc2d929f82eb37"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.606951 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.607292 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.630260 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" event={"ID":"d8d6896e-b13d-4c3b-b0e6-7feb24127794","Type":"ContainerStarted","Data":"0c0460f8d26893b495190d5fa5cb23d4e21ee782896d9b984e852d5d9a9b9d2d"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.630403 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" event={"ID":"d8d6896e-b13d-4c3b-b0e6-7feb24127794","Type":"ContainerStarted","Data":"30ef721684f067775b7a89876af3a13297bd7abec0af5f7fd1257e71bbcf142a"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.631450 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.641835 5072 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-q7b24 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.642011 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" podUID="d8d6896e-b13d-4c3b-b0e6-7feb24127794" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.647514 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-md7mn" event={"ID":"5a1b591b-8355-4747-8858-baf81ce6928d","Type":"ContainerStarted","Data":"f961a2d1d9a3e2508a6255a1381391d04aac8ee3e6a6f9a5762c72ce5b15d79d"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.647655 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-md7mn" event={"ID":"5a1b591b-8355-4747-8858-baf81ce6928d","Type":"ContainerStarted","Data":"28d0ce22c6190f04903087f330cbf06d542ca303cfdcce1b747bd7049f30f8d9"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.648667 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.649014 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.148999523 +0000 UTC m=+222.143729715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.654492 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" event={"ID":"672db961-8de6-46ec-9dd8-5d2ef7572eef","Type":"ContainerStarted","Data":"55f23b105b21232e1060706f8888f5073fe20aea1f5cc284bd3712dcf7908207"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.657994 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" event={"ID":"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec","Type":"ContainerStarted","Data":"e1bbc5d2a523d78472ca07bcc996c4d64188b10a655c8d18fb9a6a55e2eff7c8"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.672058 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50392: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.683106 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" event={"ID":"c329b662-2d9c-4a89-b244-76d7cd9c5c5c","Type":"ContainerStarted","Data":"7f6acbfe95b3e2c6f1d0873a6a514686ec08bba49f585868c772e201fb98cc07"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.683149 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" 
event={"ID":"c329b662-2d9c-4a89-b244-76d7cd9c5c5c","Type":"ContainerStarted","Data":"cb8851b2799c949501ac16dd3eabfaf3dccce25fefe86f5945a39ecf6a612ff1"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.687574 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" event={"ID":"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c","Type":"ContainerStarted","Data":"0ef6b917ec3a1f23eb4eea5bc441fe6ff05bbdda96c6928e48f379f40c81c383"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.701985 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jp259" event={"ID":"7f7f662b-a78e-4292-871f-e1c6cbfca641","Type":"ContainerStarted","Data":"46abc42f4e4693e1787c22dcd0b2a46051b1960834b371c0d81555a0efb3e829"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.702056 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jp259" event={"ID":"7f7f662b-a78e-4292-871f-e1c6cbfca641","Type":"ContainerStarted","Data":"34bce174e79c48c2d296a2093859117f8bad38478dd304ccd4c7620a21763205"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.740837 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" event={"ID":"51561c1c-e376-4dc6-9429-b1bc39a54988","Type":"ContainerStarted","Data":"1e173ae493c8d3a6bc60e8ac324c7365516984e5fad41c6102e269e9b5a28d8c"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.760940 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6htld" podStartSLOduration=175.760905716 podStartE2EDuration="2m55.760905716s" podCreationTimestamp="2026-02-28 04:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.744391524 +0000 UTC m=+221.739121716" 
watchObservedRunningTime="2026-02-28 04:13:19.760905716 +0000 UTC m=+221.755635908" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.761629 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50402: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.761869 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.767005 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.266974405 +0000 UTC m=+222.261704797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.767864 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" event={"ID":"56c867eb-6fd4-476a-8317-9f590f2ff47a","Type":"ContainerStarted","Data":"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.767914 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" event={"ID":"56c867eb-6fd4-476a-8317-9f590f2ff47a","Type":"ContainerStarted","Data":"2894b982f790fb3fb94bce3351c0c32ffd296a0da8a2350101f917c8dde4e894"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.775718 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.800526 5072 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-stshz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.800600 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.800908 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" event={"ID":"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac","Type":"ContainerStarted","Data":"96d62b60438b0c5446e965bc0dc000febacf642171abf1c16bf9b8c8c15657b7"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.833251 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" event={"ID":"96f3be9c-c37e-4f19-b218-a9e6f8461d02","Type":"ContainerStarted","Data":"057b07bbe8758995e64092f2118f41450b0abc83aa5d8ed60a029f518fe11d01"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.833302 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" event={"ID":"96f3be9c-c37e-4f19-b218-a9e6f8461d02","Type":"ContainerStarted","Data":"720e470b6e81cc3304b8c52a4beb648b7334d67492f0a2b28e7138e71b9e34fe"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.837563 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" event={"ID":"630fc50d-827c-4e60-87ad-aa5012ecbcd8","Type":"ContainerStarted","Data":"f2858e542ce6b151d7ca49e55a8c962ed9002109986011dfbea7770742e334e1"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.837627 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" event={"ID":"630fc50d-827c-4e60-87ad-aa5012ecbcd8","Type":"ContainerStarted","Data":"ecb03983108ccf7a026aac1420805f1fc3b7d904e8214a8e469998a866efe9a9"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.840004 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" event={"ID":"78753d91-e30f-43df-8c10-5f8a978a755f","Type":"ContainerStarted","Data":"ffc02f98652c1283b82f08a81900f578c6a9f0892dad83ba253e04047d98f23f"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.841270 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-md7mn" podStartSLOduration=6.8412548399999995 podStartE2EDuration="6.84125484s" podCreationTimestamp="2026-02-28 04:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.80032283 +0000 UTC m=+221.795053012" watchObservedRunningTime="2026-02-28 04:13:19.84125484 +0000 UTC m=+221.835985052" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.845726 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zc5mk" podStartSLOduration=176.845707718 podStartE2EDuration="2m56.845707718s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.845652476 +0000 UTC m=+221.840382668" watchObservedRunningTime="2026-02-28 04:13:19.845707718 +0000 UTC m=+221.840437920" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.857591 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" event={"ID":"58b4223e-e786-4022-a470-4b3c6aa754dd","Type":"ContainerStarted","Data":"365c93c29723dc8e1c32c5d366cf039445c97fb8b0b1e9fe8a149dfc54a37250"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.858038 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 
04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.862612 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" event={"ID":"3644bb59-5f48-4f4d-b244-d5128487ff5f","Type":"ContainerStarted","Data":"e1c22a65dbedb0f51c1c00ae640e6295b48804d9c788ee65df5e9d5a18d1bf3c"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.862682 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" event={"ID":"3644bb59-5f48-4f4d-b244-d5128487ff5f","Type":"ContainerStarted","Data":"2ae4003ca4040c4128160001755ed0f0547d024b80e2b5a36e392ca58c3bb7a6"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.863523 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.864511 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50406: no serving certificate available for the kubelet" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.865178 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.866459 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.366446272 +0000 UTC m=+222.361176464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.868845 5072 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hxqzd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.868936 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" podUID="58b4223e-e786-4022-a470-4b3c6aa754dd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.874158 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" event={"ID":"e440ea37-6f17-4863-a958-9e4b9debe3e3","Type":"ContainerStarted","Data":"95bd16ed84ee9dd43502c67c865da4d97c626c060751710693de85774e95e97e"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.876168 5072 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4lstm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.876256 
5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" podUID="3644bb59-5f48-4f4d-b244-d5128487ff5f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.883576 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" podStartSLOduration=176.883560974 podStartE2EDuration="2m56.883560974s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.881242292 +0000 UTC m=+221.875972504" watchObservedRunningTime="2026-02-28 04:13:19.883560974 +0000 UTC m=+221.878291166" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.893600 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" event={"ID":"61d5e54f-0eeb-4b59-9511-cdf807911640","Type":"ContainerStarted","Data":"e5f7d4bc049ecb94efc6d90b1997c58a3db995fb8d531f9c35f86aa56e4bf2fd"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.909919 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" event={"ID":"bd0cadda-3f93-43ac-b288-7e666a7f1b99","Type":"ContainerStarted","Data":"0452223c3b5f745fffaf5b0397b4c77817e783f3dbbd83c1b854cbf448faa7a1"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.909980 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" event={"ID":"bd0cadda-3f93-43ac-b288-7e666a7f1b99","Type":"ContainerStarted","Data":"52df5c98b66a8e0a7ba12320d508fd3dfd88109d11377080cc4fc1b7ea6dfc5d"} Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 
04:13:19.949028 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" podStartSLOduration=176.949011776 podStartE2EDuration="2m56.949011776s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.911770799 +0000 UTC m=+221.906501001" watchObservedRunningTime="2026-02-28 04:13:19.949011776 +0000 UTC m=+221.943741958" Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.968455 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:19 crc kubenswrapper[5072]: E0228 04:13:19.969346 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.469330686 +0000 UTC m=+222.464060878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:19 crc kubenswrapper[5072]: I0228 04:13:19.984150 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nglpx" podStartSLOduration=176.984131255 podStartE2EDuration="2m56.984131255s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.95015184 +0000 UTC m=+221.944882032" watchObservedRunningTime="2026-02-28 04:13:19.984131255 +0000 UTC m=+221.978861447" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.017597 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50418: no serving certificate available for the kubelet" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.017719 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wl8mf" podStartSLOduration=177.017688477 podStartE2EDuration="2m57.017688477s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:19.983127054 +0000 UTC m=+221.977857246" watchObservedRunningTime="2026-02-28 04:13:20.017688477 +0000 UTC m=+222.012418669" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.018695 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" podStartSLOduration=177.018689468 podStartE2EDuration="2m57.018689468s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.017916474 +0000 UTC m=+222.012646666" watchObservedRunningTime="2026-02-28 04:13:20.018689468 +0000 UTC m=+222.013419660" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.075466 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.076344 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.576174352 +0000 UTC m=+222.570904544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.108316 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.108374 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.108938 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" podStartSLOduration=177.108915278 podStartE2EDuration="2m57.108915278s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.108737053 +0000 UTC m=+222.103467245" watchObservedRunningTime="2026-02-28 04:13:20.108915278 +0000 UTC m=+222.103645470" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.110908 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n59st" podStartSLOduration=177.110900421 podStartE2EDuration="2m57.110900421s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.064656485 +0000 UTC m=+222.059386677" watchObservedRunningTime="2026-02-28 04:13:20.110900421 +0000 UTC m=+222.105630613" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.179914 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.173386 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" podStartSLOduration=177.173371279 podStartE2EDuration="2m57.173371279s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.173174393 +0000 UTC m=+222.167904585" watchObservedRunningTime="2026-02-28 04:13:20.173371279 +0000 UTC m=+222.168101471" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.180301 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.680282894 +0000 UTC m=+222.675013086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.252702 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jp259" podStartSLOduration=176.252684631 podStartE2EDuration="2m56.252684631s" podCreationTimestamp="2026-02-28 04:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.25201078 +0000 UTC m=+222.246740962" watchObservedRunningTime="2026-02-28 04:13:20.252684631 +0000 UTC m=+222.247414823" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.254945 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" podStartSLOduration=177.254939031 podStartE2EDuration="2m57.254939031s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.204231787 +0000 UTC m=+222.198961999" watchObservedRunningTime="2026-02-28 04:13:20.254939031 +0000 UTC m=+222.249669223" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.281528 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: 
\"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.281873 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.781859207 +0000 UTC m=+222.776589399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.310142 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8p4zs" podStartSLOduration=177.310119825 podStartE2EDuration="2m57.310119825s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.298258426 +0000 UTC m=+222.292988618" watchObservedRunningTime="2026-02-28 04:13:20.310119825 +0000 UTC m=+222.304850017" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.382528 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.382747 5072 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.882715868 +0000 UTC m=+222.877446060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.382844 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.383210 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.883194982 +0000 UTC m=+222.877925174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.484147 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.484340 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.984313081 +0000 UTC m=+222.979043273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.484734 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.485055 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:20.985045834 +0000 UTC m=+222.979776026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.530107 5072 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l4r96 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.530185 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" podUID="37da02ec-0d39-471d-93ea-5cca2236656d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.530120 5072 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l4r96 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.530282 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" podUID="37da02ec-0d39-471d-93ea-5cca2236656d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: 
connection refused" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.586009 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.586310 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.086256286 +0000 UTC m=+223.080986478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.586512 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.586906 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 04:13:21.086888086 +0000 UTC m=+223.081618278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.616091 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:20 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:20 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:20 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.616166 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.621951 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.686273 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" podStartSLOduration=177.68625075 podStartE2EDuration="2m57.68625075s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.32898875 +0000 UTC m=+222.323718942" watchObservedRunningTime="2026-02-28 04:13:20.68625075 +0000 UTC m=+222.680980942" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.688197 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.688849 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.188819189 +0000 UTC m=+223.183549391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.792915 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.794105 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.294086017 +0000 UTC m=+223.288816209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.804506 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50432: no serving certificate available for the kubelet" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.893870 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.894215 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.394199395 +0000 UTC m=+223.388929587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.922057 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" event={"ID":"e440ea37-6f17-4863-a958-9e4b9debe3e3","Type":"ContainerStarted","Data":"366a1f295a10f97abb70b94a2a149e4a52821123cae854567e0d217225dd4b48"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.935372 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x4ltm" event={"ID":"9ac6ac5f-443a-4b4f-88df-eaa38b4fe8ec","Type":"ContainerStarted","Data":"ccf54e3935226934b44f92d56b4e26ea294be9bdc55a4c99700d4525513988f2"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.948422 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" event={"ID":"61d5e54f-0eeb-4b59-9511-cdf807911640","Type":"ContainerStarted","Data":"0c97eb5101bbf724c3e7dddd9bd7b7763ca9b37ea4eb9ed99bc3b3d101d79cbf"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.948557 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" event={"ID":"61d5e54f-0eeb-4b59-9511-cdf807911640","Type":"ContainerStarted","Data":"7d18c3c975a3e446b01a141d705c519fe062e5e29e0631ac468eb8de37965875"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.954556 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" 
event={"ID":"281cab3a-f295-45d8-90f6-7e010d5daea5","Type":"ContainerStarted","Data":"e091600d1178212e24b05c1f0473d9f58f6e1fc84b1a8a95243bca1c74da343e"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.979323 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wnkn9" event={"ID":"ac478b24-16b1-460c-97d2-6be88f80ed94","Type":"ContainerStarted","Data":"9908d0d79e826d9b924f3eaae06feb808770f69e5faf517fcc37047e697b8fcd"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.979371 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wnkn9" event={"ID":"ac478b24-16b1-460c-97d2-6be88f80ed94","Type":"ContainerStarted","Data":"78cdd67304cf946c0c84831fbf354048765527f1f7a32a5722d303415fefcdc0"} Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.980006 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:20 crc kubenswrapper[5072]: I0228 04:13:20.997201 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:20 crc kubenswrapper[5072]: E0228 04:13:20.997535 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.497522822 +0000 UTC m=+223.492253014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.003582 5072 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-q7b24 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.003655 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24" podUID="d8d6896e-b13d-4c3b-b0e6-7feb24127794" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.003860 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" event={"ID":"51561c1c-e376-4dc6-9429-b1bc39a54988","Type":"ContainerStarted","Data":"cd49e3cceccffe5fc4d2265ebd5091c07aee1b49dc28831d6a02d6cbdb289ebb"} Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.003895 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" event={"ID":"51561c1c-e376-4dc6-9429-b1bc39a54988","Type":"ContainerStarted","Data":"f47571614200b06f7e80a4ecc277877217cdd95a5475c626deec3fce3cc5ea7c"} Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004017 5072 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004538 5072 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4lstm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004605 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm" podUID="3644bb59-5f48-4f4d-b244-d5128487ff5f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004789 5072 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hxqzd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004806 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" podUID="58b4223e-e786-4022-a470-4b3c6aa754dd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004881 5072 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-stshz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: 
connection refused" start-of-body= Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.004973 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.102078 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.102428 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.602407158 +0000 UTC m=+223.597137350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.102509 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.107763 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.607744264 +0000 UTC m=+223.602474456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.108067 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dsp7s" podStartSLOduration=178.108045713 podStartE2EDuration="2m58.108045713s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:20.996683676 +0000 UTC m=+222.991413878" watchObservedRunningTime="2026-02-28 04:13:21.108045713 +0000 UTC m=+223.102775905" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.110226 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wnkn9" podStartSLOduration=8.11021093 podStartE2EDuration="8.11021093s" podCreationTimestamp="2026-02-28 04:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:21.102002625 +0000 UTC m=+223.096732817" watchObservedRunningTime="2026-02-28 04:13:21.11021093 +0000 UTC m=+223.104941122" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.204187 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.204315 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.70429029 +0000 UTC m=+223.699020482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.204502 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.204898 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.704886439 +0000 UTC m=+223.699616631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.206883 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-64249" podStartSLOduration=178.206847769 podStartE2EDuration="2m58.206847769s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:21.202404182 +0000 UTC m=+223.197134374" watchObservedRunningTime="2026-02-28 04:13:21.206847769 +0000 UTC m=+223.201577961" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.261082 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-dsh4l" podStartSLOduration=178.261065323 podStartE2EDuration="2m58.261065323s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:21.259361029 +0000 UTC m=+223.254091221" watchObservedRunningTime="2026-02-28 04:13:21.261065323 +0000 UTC m=+223.255795515" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.307149 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.307586 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.807567246 +0000 UTC m=+223.802297438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.375609 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr" podStartSLOduration=178.375591158 podStartE2EDuration="2m58.375591158s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:21.290771755 +0000 UTC m=+223.285501957" watchObservedRunningTime="2026-02-28 04:13:21.375591158 +0000 UTC m=+223.370321340" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.377930 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"] Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.378125 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager" 
containerID="cri-o://67c2a17ca7a758951d1bff9a14bb12a1de825717f63d7a3d3f74a65fe2a48f2e" gracePeriod=30 Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.397859 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.408540 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.408897 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:21.908884611 +0000 UTC m=+223.903614803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.516295 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.516916 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.016895924 +0000 UTC m=+224.011626116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.613391 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"] Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.613631 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerName="route-controller-manager" containerID="cri-o://cdbeeed7d6697f3f1580ffc4f26dd1a3313486578e66be025ddb8d5720b6a621" gracePeriod=30 Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.614874 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:21 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:21 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:21 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.614968 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.617657 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.618000 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.117988492 +0000 UTC m=+224.112718684 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.718576 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.718824 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.21877658 +0000 UTC m=+224.213506782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.718912 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.719287 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.219276576 +0000 UTC m=+224.214006788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.820058 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.820359 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.320323683 +0000 UTC m=+224.315053885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.820873 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.821467 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.321458988 +0000 UTC m=+224.316189170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:21 crc kubenswrapper[5072]: I0228 04:13:21.922246 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:21 crc kubenswrapper[5072]: E0228 04:13:21.922708 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.422686911 +0000 UTC m=+224.417417103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.023457 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.026907 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.526892535 +0000 UTC m=+224.521622727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.056170 5072 generic.go:334] "Generic (PLEG): container finished" podID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerID="67c2a17ca7a758951d1bff9a14bb12a1de825717f63d7a3d3f74a65fe2a48f2e" exitCode=0
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.056267 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" event={"ID":"3b41acd5-2f6b-48fd-a9ba-55796e6db653","Type":"ContainerDied","Data":"67c2a17ca7a758951d1bff9a14bb12a1de825717f63d7a3d3f74a65fe2a48f2e"}
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.066019 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" event={"ID":"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c","Type":"ContainerStarted","Data":"2aa718575f5729346b9d6a9e8dc9476b8ddfca0705c150f693eed3a1382ce685"}
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.067757 5072 generic.go:334] "Generic (PLEG): container finished" podID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerID="cdbeeed7d6697f3f1580ffc4f26dd1a3313486578e66be025ddb8d5720b6a621" exitCode=0
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.068475 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" event={"ID":"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30","Type":"ContainerDied","Data":"cdbeeed7d6697f3f1580ffc4f26dd1a3313486578e66be025ddb8d5720b6a621"}
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.069317 5072 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-stshz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.069395 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.109287 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4lstm"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.131533 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.133378 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.633349559 +0000 UTC m=+224.628079791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.138596 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.162952 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.225093 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50444: no serving certificate available for the kubelet"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.238010 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.238876 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.738854435 +0000 UTC m=+224.733584627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341064 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert\") pod \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341128 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert\") pod \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341181 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca\") pod \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341301 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341340 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config\") pod \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341362 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config\") pod \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341383 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles\") pod \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341447 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca\") pod \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341486 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdhtf\" (UniqueName: \"kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf\") pod \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\" (UID: \"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.341543 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vf5\" (UniqueName: \"kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5\") pod \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\" (UID: \"3b41acd5-2f6b-48fd-a9ba-55796e6db653\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.343410 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config" (OuterVolumeSpecName: "config") pod "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" (UID: "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.345181 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b41acd5-2f6b-48fd-a9ba-55796e6db653" (UID: "3b41acd5-2f6b-48fd-a9ba-55796e6db653"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.348393 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3b41acd5-2f6b-48fd-a9ba-55796e6db653" (UID: "3b41acd5-2f6b-48fd-a9ba-55796e6db653"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.349070 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config" (OuterVolumeSpecName: "config") pod "3b41acd5-2f6b-48fd-a9ba-55796e6db653" (UID: "3b41acd5-2f6b-48fd-a9ba-55796e6db653"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.353260 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5" (OuterVolumeSpecName: "kube-api-access-x7vf5") pod "3b41acd5-2f6b-48fd-a9ba-55796e6db653" (UID: "3b41acd5-2f6b-48fd-a9ba-55796e6db653"). InnerVolumeSpecName "kube-api-access-x7vf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.353963 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" (UID: "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.354370 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" (UID: "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.354488 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.854468943 +0000 UTC m=+224.849199135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.357407 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"]
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.357671 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerName="route-controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.357684 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerName="route-controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.357702 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.357709 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.357802 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" containerName="route-controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.357817 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" containerName="controller-manager"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.358244 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.359423 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b41acd5-2f6b-48fd-a9ba-55796e6db653" (UID: "3b41acd5-2f6b-48fd-a9ba-55796e6db653"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.363101 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf" (OuterVolumeSpecName: "kube-api-access-pdhtf") pod "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" (UID: "c7d7baf9-5adb-4cc1-a07b-d093bcd78e30"). InnerVolumeSpecName "kube-api-access-pdhtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.381926 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"]
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.383375 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.389151 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"]
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.405060 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"]
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.444657 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445435 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445505 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdhtf\" (UniqueName: \"kubernetes.io/projected/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-kube-api-access-pdhtf\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.445780 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:22.945765137 +0000 UTC m=+224.940495329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445924 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vf5\" (UniqueName: \"kubernetes.io/projected/3b41acd5-2f6b-48fd-a9ba-55796e6db653-kube-api-access-x7vf5\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445938 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445948 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b41acd5-2f6b-48fd-a9ba-55796e6db653-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445957 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445966 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445975 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.445983 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b41acd5-2f6b-48fd-a9ba-55796e6db653-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.514814 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q7b24"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.547223 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.547750 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.547827 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.547900 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548015 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548124 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548175 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548237 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548309 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5mh\" (UniqueName: \"kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.548349 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.04832249 +0000 UTC m=+225.043052682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.548401 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxrk\" (UniqueName: \"kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.609139 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 04:13:22 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld
Feb 28 04:13:22 crc kubenswrapper[5072]: [+]process-running ok
Feb 28 04:13:22 crc kubenswrapper[5072]: healthz check failed
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.609205 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.649925 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.649967 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.649994 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650026 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5mh\" (UniqueName: \"kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650077 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkxrk\" (UniqueName: \"kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650103 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650125 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650151 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650171 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.650191 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.650491 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.150478812 +0000 UTC m=+225.145209004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.651988 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.652108 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.653233 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.653778 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.657018 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.657563 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.658424 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.676537 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkxrk\" (UniqueName: \"kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk\") pod \"controller-manager-554bf44686-cvpls\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") " pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.680397 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5mh\" (UniqueName: \"kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh\") pod \"route-controller-manager-74667d9b5d-hz4rs\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.718404 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.751060 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.751436 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.751989 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.251973671 +0000 UTC m=+225.246703863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.854355 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.854788 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.354766473 +0000 UTC m=+225.349496665 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.956775 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.957256 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.457205552 +0000 UTC m=+225.451935734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:22 crc kubenswrapper[5072]: I0228 04:13:22.957483 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:22 crc kubenswrapper[5072]: E0228 04:13:22.957928 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.457919565 +0000 UTC m=+225.452649757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.059693 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.060481 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.560460807 +0000 UTC m=+225.555190999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.135785 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" event={"ID":"3b41acd5-2f6b-48fd-a9ba-55796e6db653","Type":"ContainerDied","Data":"5020feb8d5301cfd61745896ccbd8c1f8791245b4c8dd43d94960310d7c47b7a"} Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.135842 5072 scope.go:117] "RemoveContainer" containerID="67c2a17ca7a758951d1bff9a14bb12a1de825717f63d7a3d3f74a65fe2a48f2e" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.135981 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dcccb" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.142442 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.142921 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft" event={"ID":"c7d7baf9-5adb-4cc1-a07b-d093bcd78e30","Type":"ContainerDied","Data":"aa791c5076cac2053e8fd2feebde7a75347f36327bbf9e018ac6e11108bd3049"} Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.161667 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.162222 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.662202266 +0000 UTC m=+225.656932458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.191731 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.193271 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.195616 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dcccb"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.221448 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.225215 5072 scope.go:117] "RemoveContainer" containerID="cdbeeed7d6697f3f1580ffc4f26dd1a3313486578e66be025ddb8d5720b6a621" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.225394 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j55ft"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.253928 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.264027 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.264553 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.764521952 +0000 UTC m=+225.759252144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.318543 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g7hzc"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.319863 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.326934 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.342947 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7hzc"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.366870 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.367362 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.867342223 +0000 UTC m=+225.862072415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.467576 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.467738 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.96771482 +0000 UTC m=+225.962445012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.467924 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhc6\" (UniqueName: \"kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.467963 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.468059 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.468122 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.468420 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:23.9684045 +0000 UTC m=+225.963134692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.515414 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.516735 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.518980 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.537971 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.540634 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l4r96" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.569454 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.570079 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.570156 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhc6\" (UniqueName: \"kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.570188 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.570696 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.571090 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.071069828 +0000 UTC m=+226.065800020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.579389 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.597531 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhc6\" (UniqueName: \"kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6\") pod \"community-operators-g7hzc\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") " pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.614034 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:23 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:23 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:23 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.614098 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.672374 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkkj4\" (UniqueName: \"kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.672458 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.672504 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.672985 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.17296147 +0000 UTC m=+226.167691662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.673047 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.719920 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cpbd5"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.721864 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.730818 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpbd5"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.744468 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.776766 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.777172 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkkj4\" (UniqueName: \"kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.777273 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.777436 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.778285 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") 
" pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.784594 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.284567934 +0000 UTC m=+226.279298126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.784996 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.825051 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkkj4\" (UniqueName: \"kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4\") pod \"certified-operators-b4wcw\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") " pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.869506 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4wcw" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.880671 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.880743 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.880765 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8td77\" (UniqueName: \"kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.880833 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.881194 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.381168903 +0000 UTC m=+226.375899095 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.917772 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.918951 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.942081 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"] Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.987338 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.987657 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.487610927 +0000 UTC m=+226.482341119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.987755 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.987804 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.987835 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.987851 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8td77\" (UniqueName: \"kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77\") pod \"community-operators-cpbd5\" (UID: 
\"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: E0228 04:13:23.988165 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.488150084 +0000 UTC m=+226.482880276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.988653 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:23 crc kubenswrapper[5072]: I0228 04:13:23.988879 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content\") pod \"community-operators-cpbd5\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.018871 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8td77\" (UniqueName: \"kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77\") pod \"community-operators-cpbd5\" (UID: 
\"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.040238 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.089284 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.090204 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.090264 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.090333 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhh42\" (UniqueName: \"kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.090436 
5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.590421838 +0000 UTC m=+226.585152030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.186305 5072 generic.go:334] "Generic (PLEG): container finished" podID="bd0cadda-3f93-43ac-b288-7e666a7f1b99" containerID="0452223c3b5f745fffaf5b0397b4c77817e783f3dbbd83c1b854cbf448faa7a1" exitCode=0 Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.186430 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" event={"ID":"bd0cadda-3f93-43ac-b288-7e666a7f1b99","Type":"ContainerDied","Data":"0452223c3b5f745fffaf5b0397b4c77817e783f3dbbd83c1b854cbf448faa7a1"} Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.191653 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhh42\" (UniqueName: \"kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.196725 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.196809 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.196887 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.199620 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.200139 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.700126764 +0000 UTC m=+226.694856956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.200483 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.224617 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhh42\" (UniqueName: \"kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42\") pod \"certified-operators-bkl6g\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") " pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.233981 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" event={"ID":"91bd618e-7328-48ac-b2d9-23f924c1405f","Type":"ContainerStarted","Data":"5a0386cf627eb07759095a74d0bbdb4ad01e0759a35b1d9d2c3c58c576971184"} Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.234050 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" event={"ID":"91bd618e-7328-48ac-b2d9-23f924c1405f","Type":"ContainerStarted","Data":"9205f742f32d86e083b5ec9cd361075e5e65f35ed8297c39654c75839ad98d50"} Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 
04:13:24.235184 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.239377 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkl6g" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.259860 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" podStartSLOduration=2.259836788 podStartE2EDuration="2.259836788s" podCreationTimestamp="2026-02-28 04:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:24.258136804 +0000 UTC m=+226.252866996" watchObservedRunningTime="2026-02-28 04:13:24.259836788 +0000 UTC m=+226.254566980" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.272716 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" event={"ID":"f06875c2-245b-4e2b-8746-5ebc4ecee456","Type":"ContainerStarted","Data":"668d0462d3b77e594cc3f0740ea98b8f449f61ebc644a9776514a412f621a31a"} Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.272777 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" event={"ID":"f06875c2-245b-4e2b-8746-5ebc4ecee456","Type":"ContainerStarted","Data":"461d9d1615e27063f32c28e0ed1aff79d25b78c494a897bae58c81659ee32428"} Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.273465 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.279999 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.298011 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.298594 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.299020 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.799006184 +0000 UTC m=+226.793736376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.305115 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podStartSLOduration=2.305096133 podStartE2EDuration="2.305096133s" podCreationTimestamp="2026-02-28 04:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:24.30501577 +0000 UTC m=+226.299745962" watchObservedRunningTime="2026-02-28 04:13:24.305096133 +0000 UTC m=+226.299826315" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.353025 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.353081 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mzmcb" podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.353424 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial 
tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.353447 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mzmcb" podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.391279 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.401906 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.402711 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:24.902685732 +0000 UTC m=+226.897415924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.433141 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7hzc"] Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.524148 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.524556 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.024520903 +0000 UTC m=+227.019251105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.526463 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.532387 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.032369907 +0000 UTC m=+227.027100109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.559192 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.577855 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-49m8v" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.580549 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.580590 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.583467 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qvn7m" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.603986 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"] Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.606670 5072 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sb9bc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]log ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]etcd ok 
Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/generic-apiserver-start-informers ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/max-in-flight-filter ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 28 04:13:24 crc kubenswrapper[5072]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/project.openshift.io-projectcache ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/openshift.io-startinformers ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 28 04:13:24 crc kubenswrapper[5072]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 28 04:13:24 crc kubenswrapper[5072]: livez check failed Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.606729 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" podUID="c70f8b69-6dcb-4aa3-9823-b7cf8d6329e8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.613455 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:24 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:24 crc 
kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:24 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.613521 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.628075 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.628168 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.12815239 +0000 UTC m=+227.122882572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.628490 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.631497 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.131483304 +0000 UTC m=+227.126213496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.733272 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.733832 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.23381107 +0000 UTC m=+227.228541262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.753599 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b41acd5-2f6b-48fd-a9ba-55796e6db653" path="/var/lib/kubelet/pods/3b41acd5-2f6b-48fd-a9ba-55796e6db653/volumes"
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.754268 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d7baf9-5adb-4cc1-a07b-d093bcd78e30" path="/var/lib/kubelet/pods/c7d7baf9-5adb-4cc1-a07b-d093bcd78e30/volumes"
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.838725 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.839002 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.338990965 +0000 UTC m=+227.333721157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.856918 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpbd5"]
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.877990 5072 ???:1] "http: TLS handshake error from 192.168.126.11:50450: no serving certificate available for the kubelet"
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.941282 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:24 crc kubenswrapper[5072]: E0228 04:13:24.941561 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.441545958 +0000 UTC m=+227.436276150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:24 crc kubenswrapper[5072]: I0228 04:13:24.980460 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"]
Feb 28 04:13:25 crc kubenswrapper[5072]: W0228 04:13:25.012428 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9932ea0b_dd67_4ffb_a303_3ba7b97730ef.slice/crio-422f9715aab56b28df89145d4595997b85136a303fdc202f7d0aa3f548d170d5 WatchSource:0}: Error finding container 422f9715aab56b28df89145d4595997b85136a303fdc202f7d0aa3f548d170d5: Status 404 returned error can't find the container with id 422f9715aab56b28df89145d4595997b85136a303fdc202f7d0aa3f548d170d5
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.043969 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.044307 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.544295107 +0000 UTC m=+227.539025299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.144850 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.145526 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.645497079 +0000 UTC m=+227.640227271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.145613 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.146142 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.646120918 +0000 UTC m=+227.640851110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.209622 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.210512 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-58vm7"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.212157 5072 patch_prober.go:28] interesting pod/console-f9d7485db-58vm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.212205 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-58vm7" podUID="0bc940ed-4de2-4e88-98d7-8d9de59cd63d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.246457 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.247544 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.747503016 +0000 UTC m=+227.742233228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.270179 5072 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.283792 5072 generic.go:334] "Generic (PLEG): container finished" podID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerID="35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1" exitCode=0
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.283854 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerDied","Data":"35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.283881 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerStarted","Data":"569779664f7de75250a9a132151d2fd297579f066c8d1d1c75f0ec9fe5c16c37"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.295293 5072 generic.go:334] "Generic (PLEG): container finished" podID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerID="68e993abad76a3fedafcb441fe67b8278bfdcf5f9df81f866b069142ce99a6e8" exitCode=0
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.295385 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerDied","Data":"68e993abad76a3fedafcb441fe67b8278bfdcf5f9df81f866b069142ce99a6e8"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.295427 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerStarted","Data":"0250d0b5bb40464bf2f386c78c4a18882c23849a44adbe9a2116d217105f5f74"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.313387 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerStarted","Data":"8cc0adc3e6f35342eccfa44b64faa6fe88f6f6a8adc3268fea50554b0f5a9de0"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.313439 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerStarted","Data":"422f9715aab56b28df89145d4595997b85136a303fdc202f7d0aa3f548d170d5"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.329587 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"]
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.330700 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" event={"ID":"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c","Type":"ContainerStarted","Data":"efd398c46a5762f83862d04f3a9ce8052f07837aa2a0a4f2a834390af8423a9a"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.336883 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" event={"ID":"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c","Type":"ContainerStarted","Data":"b15633e8a260d83715678eb3c6979eeb787dcd5ee37c476dbe6af6590f70b92e"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.331742 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.340896 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.349676 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqsz\" (UniqueName: \"kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.349756 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.349818 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.349857 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.351017 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.851004348 +0000 UTC m=+227.845734540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.367242 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"]
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.367636 5072 generic.go:334] "Generic (PLEG): container finished" podID="11731378-2c2a-448a-918f-2e2f07619ee0" containerID="5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924" exitCode=0
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.368525 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerDied","Data":"5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.368558 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerStarted","Data":"eaf73c18b34021ac2539b6035a061043bab001f53b0e5958a163786405345ef7"}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.456425 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.456589 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.456621 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.456669 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqsz\" (UniqueName: \"kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.457027 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:25.957008969 +0000 UTC m=+227.951739161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.457539 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.457758 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.500405 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqsz\" (UniqueName: \"kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz\") pod \"redhat-marketplace-gw7ld\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") " pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.560746 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.563660 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:26.063628848 +0000 UTC m=+228.058359040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.613930 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 04:13:25 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld
Feb 28 04:13:25 crc kubenswrapper[5072]: [+]process-running ok
Feb 28 04:13:25 crc kubenswrapper[5072]: healthz check failed
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.614014 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.662151 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.662354 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:26.162324372 +0000 UTC m=+228.157054564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.662510 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.662803 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:26.162790577 +0000 UTC m=+228.157520759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.706780 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.712692 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.715713 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"]
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.715962 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0cadda-3f93-43ac-b288-7e666a7f1b99" containerName="collect-profiles"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.715975 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0cadda-3f93-43ac-b288-7e666a7f1b99" containerName="collect-profiles"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.716080 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0cadda-3f93-43ac-b288-7e666a7f1b99" containerName="collect-profiles"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.717324 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.742520 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"]
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.763936 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.765115 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 04:13:26.265095272 +0000 UTC m=+228.259825464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.857821 5072 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-28T04:13:25.270211361Z","Handler":null,"Name":""}
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.866444 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghwrz\" (UniqueName: \"kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz\") pod \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.866503 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume\") pod \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.866929 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume\") pod \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\" (UID: \"bd0cadda-3f93-43ac-b288-7e666a7f1b99\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.867109 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.867152 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phxxs\" (UniqueName: \"kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.867211 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.867271 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.868529 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd0cadda-3f93-43ac-b288-7e666a7f1b99" (UID: "bd0cadda-3f93-43ac-b288-7e666a7f1b99"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:13:25 crc kubenswrapper[5072]: E0228 04:13:25.868900 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 04:13:26.368883804 +0000 UTC m=+228.363614046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bw85j" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.871928 5072 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.871992 5072 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.876288 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bd0cadda-3f93-43ac-b288-7e666a7f1b99" (UID: "bd0cadda-3f93-43ac-b288-7e666a7f1b99"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.876748 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz" (OuterVolumeSpecName: "kube-api-access-ghwrz") pod "bd0cadda-3f93-43ac-b288-7e666a7f1b99" (UID: "bd0cadda-3f93-43ac-b288-7e666a7f1b99"). InnerVolumeSpecName "kube-api-access-ghwrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.901311 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.902419 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.914209 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.914423 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.929209 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970147 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970359 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phxxs\" (UniqueName: \"kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970463 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970518 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970573 5072 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd0cadda-3f93-43ac-b288-7e666a7f1b99-config-volume\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970589 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghwrz\" (UniqueName: \"kubernetes.io/projected/bd0cadda-3f93-43ac-b288-7e666a7f1b99-kube-api-access-ghwrz\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.970601 5072 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd0cadda-3f93-43ac-b288-7e666a7f1b99-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.971100 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.971551 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.986627 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 28 04:13:25 crc kubenswrapper[5072]: I0228 04:13:25.997132 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phxxs\" (UniqueName: \"kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs\") pod \"redhat-marketplace-g7wqp\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.054708 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.071495 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.071578 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.071613 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.088328 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.088474 5072 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.088520 5072 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:26 crc kubenswrapper[5072]: W0228 04:13:26.108065 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c8a41b_5700_46e9_bea3_aac12066069f.slice/crio-1a76b49afbcbe3ed71b53055d861f9fa26981cc4656539b1209f10ad8d23dfaf WatchSource:0}: Error finding container 1a76b49afbcbe3ed71b53055d861f9fa26981cc4656539b1209f10ad8d23dfaf: Status 404 returned error can't find the container with id 1a76b49afbcbe3ed71b53055d861f9fa26981cc4656539b1209f10ad8d23dfaf Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.154557 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bw85j\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.173321 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.173375 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.173449 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.196182 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.236173 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.254522 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.255217 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.261480 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.261656 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.267673 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.303227 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hxqzd" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.381096 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.382827 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.386489 5072 generic.go:334] "Generic (PLEG): container finished" podID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerID="8cc0adc3e6f35342eccfa44b64faa6fe88f6f6a8adc3268fea50554b0f5a9de0" exitCode=0 Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.386586 5072 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerDied","Data":"8cc0adc3e6f35342eccfa44b64faa6fe88f6f6a8adc3268fea50554b0f5a9de0"} Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.391518 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" event={"ID":"0caa7e2d-d3a6-4df8-a734-0d607ec64f5c","Type":"ContainerStarted","Data":"0399f4e027d38df49a61a50c2a97d86332840b12d9975d0bf78e46f142af837e"} Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.404074 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" event={"ID":"bd0cadda-3f93-43ac-b288-7e666a7f1b99","Type":"ContainerDied","Data":"52df5c98b66a8e0a7ba12320d508fd3dfd88109d11377080cc4fc1b7ea6dfc5d"} Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.404155 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52df5c98b66a8e0a7ba12320d508fd3dfd88109d11377080cc4fc1b7ea6dfc5d" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.404300 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.423182 5072 generic.go:334] "Generic (PLEG): container finished" podID="96c8a41b-5700-46e9-bea3-aac12066069f" containerID="a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638" exitCode=0 Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.423285 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerDied","Data":"a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638"} Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.424344 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerStarted","Data":"1a76b49afbcbe3ed71b53055d861f9fa26981cc4656539b1209f10ad8d23dfaf"} Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.430531 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.434261 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mvt85" podStartSLOduration=13.434234703 podStartE2EDuration="13.434234703s" podCreationTimestamp="2026-02-28 04:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:26.428841355 +0000 UTC m=+228.423571557" watchObservedRunningTime="2026-02-28 04:13:26.434234703 +0000 UTC m=+228.428964895" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.464811 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.484248 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.486284 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.486350 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 
04:13:26 crc kubenswrapper[5072]: W0228 04:13:26.492021 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1917348a_ad88_41e1_a1f1_215706769c5e.slice/crio-4dbac56433ca8d5be7ed98724dcabf0fdb18b47ca3c8b480550ce9d1f6024534 WatchSource:0}: Error finding container 4dbac56433ca8d5be7ed98724dcabf0fdb18b47ca3c8b480550ce9d1f6024534: Status 404 returned error can't find the container with id 4dbac56433ca8d5be7ed98724dcabf0fdb18b47ca3c8b480550ce9d1f6024534 Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.504206 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.522748 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.525861 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.534880 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.540169 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.583611 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.587223 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.587293 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c7n7\" (UniqueName: \"kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.587318 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.594874 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.606873 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.611105 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:26 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:26 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:26 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.611183 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.658918 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.688910 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.688970 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c7n7\" (UniqueName: \"kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7\") pod \"redhat-operators-67nm2\" (UID: 
\"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.689005 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.690155 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.691069 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.696869 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.744912 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c7n7\" (UniqueName: \"kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7\") pod \"redhat-operators-67nm2\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") " pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.874455 5072 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.898010 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.922577 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2k7bm"] Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.923798 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:26 crc kubenswrapper[5072]: I0228 04:13:26.939096 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2k7bm"] Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.022180 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqcmq\" (UniqueName: \"kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.022261 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.022415 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " 
pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.029567 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 04:13:27 crc kubenswrapper[5072]: W0228 04:13:27.068190 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb0b253b9_c164_4de4_83f8_bc2387e3d520.slice/crio-01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041 WatchSource:0}: Error finding container 01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041: Status 404 returned error can't find the container with id 01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041 Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.124730 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.132931 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqcmq\" (UniqueName: \"kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.133039 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.133786 5072 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.134333 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.155929 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqcmq\" (UniqueName: \"kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq\") pod \"redhat-operators-2k7bm\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.294288 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.448367 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"] Feb 28 04:13:27 crc kubenswrapper[5072]: W0228 04:13:27.477491 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68fedb7_ccc2_4f59_8b91_7a59776ccd1d.slice/crio-6810b1b67ea37348b2f21fbd87e6607f601a001be27f480d2d1e6af6ead94daa WatchSource:0}: Error finding container 6810b1b67ea37348b2f21fbd87e6607f601a001be27f480d2d1e6af6ead94daa: Status 404 returned error can't find the container with id 6810b1b67ea37348b2f21fbd87e6607f601a001be27f480d2d1e6af6ead94daa Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.491531 5072 generic.go:334] "Generic (PLEG): container finished" podID="1917348a-ad88-41e1-a1f1-215706769c5e" containerID="be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5" exitCode=0 Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.491632 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerDied","Data":"be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.493684 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerStarted","Data":"4dbac56433ca8d5be7ed98724dcabf0fdb18b47ca3c8b480550ce9d1f6024534"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.501827 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"b0b253b9-c164-4de4-83f8-bc2387e3d520","Type":"ContainerStarted","Data":"01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.506899 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54b9fa87-83eb-407b-a4cf-dd288ca28bb8","Type":"ContainerStarted","Data":"cf524f8dde2bf16723275002f1835963e68bc7a8b425b70361c2a87de7f44e08"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.512257 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" event={"ID":"3b94a919-0f97-48a8-aac9-4f52655d572d","Type":"ContainerStarted","Data":"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.512304 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" event={"ID":"3b94a919-0f97-48a8-aac9-4f52655d572d","Type":"ContainerStarted","Data":"3da31d7e10c6625b7b409ddef629c06429c6527356b29682cdf6571a6986dc5a"} Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.611952 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:27 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:27 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:27 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:27 crc kubenswrapper[5072]: I0228 04:13:27.612012 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:27 crc 
kubenswrapper[5072]: I0228 04:13:27.685823 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2k7bm"] Feb 28 04:13:27 crc kubenswrapper[5072]: W0228 04:13:27.728149 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd61caed_be31_4706_9677_0da76d2cb2e7.slice/crio-9aff94aad5af4de13bbf65e72a8ce74b1cde16f71835d008bee91127c94811fc WatchSource:0}: Error finding container 9aff94aad5af4de13bbf65e72a8ce74b1cde16f71835d008bee91127c94811fc: Status 404 returned error can't find the container with id 9aff94aad5af4de13bbf65e72a8ce74b1cde16f71835d008bee91127c94811fc Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.536590 5072 generic.go:334] "Generic (PLEG): container finished" podID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerID="b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75" exitCode=0 Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.536732 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerDied","Data":"b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.537179 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerStarted","Data":"6810b1b67ea37348b2f21fbd87e6607f601a001be27f480d2d1e6af6ead94daa"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.551171 5072 generic.go:334] "Generic (PLEG): container finished" podID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerID="b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36" exitCode=0 Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.551324 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerDied","Data":"b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.554102 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerStarted","Data":"9aff94aad5af4de13bbf65e72a8ce74b1cde16f71835d008bee91127c94811fc"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.588665 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0b253b9-c164-4de4-83f8-bc2387e3d520","Type":"ContainerStarted","Data":"8481f2eeb598f1aad3861cf13f9f5dc9f0f0f5182c9b806a8f77fbe3c7e93e7c"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.600505 5072 generic.go:334] "Generic (PLEG): container finished" podID="54b9fa87-83eb-407b-a4cf-dd288ca28bb8" containerID="761e99fd9e283d0c1eefdc9f1dc750568a86027ac2a7c4704c44011de64463f0" exitCode=0 Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.601086 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54b9fa87-83eb-407b-a4cf-dd288ca28bb8","Type":"ContainerDied","Data":"761e99fd9e283d0c1eefdc9f1dc750568a86027ac2a7c4704c44011de64463f0"} Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.601162 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.614876 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.6148515310000002 podStartE2EDuration="2.614851531s" podCreationTimestamp="2026-02-28 04:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:28.610854147 +0000 UTC m=+230.605584339" watchObservedRunningTime="2026-02-28 04:13:28.614851531 +0000 UTC m=+230.609581723" Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.615046 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:28 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:28 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:28 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.615217 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:28 crc kubenswrapper[5072]: I0228 04:13:28.636653 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" podStartSLOduration=185.636588156 podStartE2EDuration="3m5.636588156s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:13:28.633746157 +0000 UTC m=+230.628476369" watchObservedRunningTime="2026-02-28 04:13:28.636588156 +0000 UTC m=+230.631318338" Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.586119 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.590828 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-sb9bc" Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.608635 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:29 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:29 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:29 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.608713 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.618269 5072 generic.go:334] "Generic (PLEG): container finished" podID="b0b253b9-c164-4de4-83f8-bc2387e3d520" containerID="8481f2eeb598f1aad3861cf13f9f5dc9f0f0f5182c9b806a8f77fbe3c7e93e7c" exitCode=0 Feb 28 04:13:29 crc kubenswrapper[5072]: I0228 04:13:29.618555 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0b253b9-c164-4de4-83f8-bc2387e3d520","Type":"ContainerDied","Data":"8481f2eeb598f1aad3861cf13f9f5dc9f0f0f5182c9b806a8f77fbe3c7e93e7c"} Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.031533 5072 ???:1] "http: TLS handshake error from 192.168.126.11:42196: no serving certificate available for the kubelet" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.079804 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.188801 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir\") pod \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.188947 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54b9fa87-83eb-407b-a4cf-dd288ca28bb8" (UID: "54b9fa87-83eb-407b-a4cf-dd288ca28bb8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.188975 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access\") pod \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\" (UID: \"54b9fa87-83eb-407b-a4cf-dd288ca28bb8\") " Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.189538 5072 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.219821 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54b9fa87-83eb-407b-a4cf-dd288ca28bb8" (UID: "54b9fa87-83eb-407b-a4cf-dd288ca28bb8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.291172 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b9fa87-83eb-407b-a4cf-dd288ca28bb8-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.609153 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:30 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:30 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:30 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.609238 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.639010 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.639237 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54b9fa87-83eb-407b-a4cf-dd288ca28bb8","Type":"ContainerDied","Data":"cf524f8dde2bf16723275002f1835963e68bc7a8b425b70361c2a87de7f44e08"} Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.639489 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf524f8dde2bf16723275002f1835963e68bc7a8b425b70361c2a87de7f44e08" Feb 28 04:13:30 crc kubenswrapper[5072]: I0228 04:13:30.906627 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.000489 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir\") pod \"b0b253b9-c164-4de4-83f8-bc2387e3d520\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.000765 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access\") pod \"b0b253b9-c164-4de4-83f8-bc2387e3d520\" (UID: \"b0b253b9-c164-4de4-83f8-bc2387e3d520\") " Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.001092 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b0b253b9-c164-4de4-83f8-bc2387e3d520" (UID: "b0b253b9-c164-4de4-83f8-bc2387e3d520"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.009233 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b0b253b9-c164-4de4-83f8-bc2387e3d520" (UID: "b0b253b9-c164-4de4-83f8-bc2387e3d520"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.106592 5072 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0b253b9-c164-4de4-83f8-bc2387e3d520-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.106744 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0b253b9-c164-4de4-83f8-bc2387e3d520-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.609621 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:31 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:31 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:31 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.609795 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.665327 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0b253b9-c164-4de4-83f8-bc2387e3d520","Type":"ContainerDied","Data":"01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041"} Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.665389 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f5ac07290b168d0c3123d70f4d6b9edfd7525fd9f026700900e7bc8145c041" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.665424 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 04:13:31 crc kubenswrapper[5072]: I0228 04:13:31.681290 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wnkn9" Feb 28 04:13:32 crc kubenswrapper[5072]: I0228 04:13:32.316936 5072 ???:1] "http: TLS handshake error from 192.168.126.11:42198: no serving certificate available for the kubelet" Feb 28 04:13:32 crc kubenswrapper[5072]: I0228 04:13:32.608838 5072 patch_prober.go:28] interesting pod/router-default-5444994796-9wdtp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 04:13:32 crc kubenswrapper[5072]: [-]has-synced failed: reason withheld Feb 28 04:13:32 crc kubenswrapper[5072]: [+]process-running ok Feb 28 04:13:32 crc kubenswrapper[5072]: healthz check failed Feb 28 04:13:32 crc kubenswrapper[5072]: I0228 04:13:32.608909 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wdtp" podUID="49976b40-ea77-4857-a99f-f4a65df82e05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 04:13:33 crc kubenswrapper[5072]: I0228 04:13:33.610935 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 
28 04:13:33 crc kubenswrapper[5072]: I0228 04:13:33.614814 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9wdtp" Feb 28 04:13:34 crc kubenswrapper[5072]: I0228 04:13:34.351780 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:34 crc kubenswrapper[5072]: I0228 04:13:34.352149 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mzmcb" podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 04:13:34 crc kubenswrapper[5072]: I0228 04:13:34.351833 5072 patch_prober.go:28] interesting pod/downloads-7954f5f757-mzmcb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 04:13:34 crc kubenswrapper[5072]: I0228 04:13:34.352552 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mzmcb" podUID="dbdad8a2-b26c-4587-8a9c-cdf96b65c15f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 04:13:35 crc kubenswrapper[5072]: I0228 04:13:35.289962 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:35 crc kubenswrapper[5072]: I0228 04:13:35.295221 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-58vm7" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 
04:13:40.300145 5072 ???:1] "http: TLS handshake error from 192.168.126.11:51280: no serving certificate available for the kubelet" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.419001 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"] Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.419221 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager" containerID="cri-o://668d0462d3b77e594cc3f0740ea98b8f449f61ebc644a9776514a412f621a31a" gracePeriod=30 Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.480494 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"] Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.480741 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" containerID="cri-o://5a0386cf627eb07759095a74d0bbdb4ad01e0759a35b1d9d2c3c58c576971184" gracePeriod=30 Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.780255 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.783429 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.797522 5072 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/109581ed-36ab-4625-bf7e-bcdecb30e35a-metrics-certs\") pod \"network-metrics-daemon-95gbg\" (UID: \"109581ed-36ab-4625-bf7e-bcdecb30e35a\") " pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.972298 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 28 04:13:40 crc kubenswrapper[5072]: I0228 04:13:40.981553 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-95gbg" Feb 28 04:13:41 crc kubenswrapper[5072]: I0228 04:13:41.740889 5072 generic.go:334] "Generic (PLEG): container finished" podID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerID="5a0386cf627eb07759095a74d0bbdb4ad01e0759a35b1d9d2c3c58c576971184" exitCode=0 Feb 28 04:13:41 crc kubenswrapper[5072]: I0228 04:13:41.741426 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" event={"ID":"91bd618e-7328-48ac-b2d9-23f924c1405f","Type":"ContainerDied","Data":"5a0386cf627eb07759095a74d0bbdb4ad01e0759a35b1d9d2c3c58c576971184"} Feb 28 04:13:41 crc kubenswrapper[5072]: I0228 04:13:41.743352 5072 generic.go:334] "Generic (PLEG): container finished" podID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerID="668d0462d3b77e594cc3f0740ea98b8f449f61ebc644a9776514a412f621a31a" exitCode=0 Feb 28 04:13:41 crc kubenswrapper[5072]: I0228 04:13:41.743387 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" event={"ID":"f06875c2-245b-4e2b-8746-5ebc4ecee456","Type":"ContainerDied","Data":"668d0462d3b77e594cc3f0740ea98b8f449f61ebc644a9776514a412f621a31a"} Feb 28 04:13:42 crc kubenswrapper[5072]: I0228 04:13:42.725029 5072 patch_prober.go:28] interesting 
pod/controller-manager-554bf44686-cvpls container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Feb 28 04:13:42 crc kubenswrapper[5072]: I0228 04:13:42.725126 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Feb 28 04:13:42 crc kubenswrapper[5072]: I0228 04:13:42.752699 5072 patch_prober.go:28] interesting pod/route-controller-manager-74667d9b5d-hz4rs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Feb 28 04:13:42 crc kubenswrapper[5072]: I0228 04:13:42.752762 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Feb 28 04:13:44 crc kubenswrapper[5072]: I0228 04:13:44.358563 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mzmcb" Feb 28 04:13:46 crc kubenswrapper[5072]: I0228 04:13:46.436313 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" Feb 28 04:13:50 crc kubenswrapper[5072]: I0228 04:13:50.105269 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:13:50 crc kubenswrapper[5072]: I0228 04:13:50.105334 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:13:52 crc kubenswrapper[5072]: I0228 04:13:52.720386 5072 patch_prober.go:28] interesting pod/controller-manager-554bf44686-cvpls container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Feb 28 04:13:52 crc kubenswrapper[5072]: I0228 04:13:52.720798 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Feb 28 04:13:52 crc kubenswrapper[5072]: I0228 04:13:52.752800 5072 patch_prober.go:28] interesting pod/route-controller-manager-74667d9b5d-hz4rs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Feb 28 04:13:52 crc kubenswrapper[5072]: I0228 04:13:52.752868 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" 
podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Feb 28 04:13:52 crc kubenswrapper[5072]: I0228 04:13:52.806555 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 04:13:54 crc kubenswrapper[5072]: E0228 04:13:54.131982 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 28 04:13:54 crc kubenswrapper[5072]: E0228 04:13:54.133220 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkkj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-b4wcw_openshift-marketplace(9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 28 04:13:54 crc kubenswrapper[5072]: E0228 04:13:54.134378 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b4wcw" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" Feb 28 04:13:54 crc 
kubenswrapper[5072]: E0228 04:13:54.848213 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 28 04:13:54 crc kubenswrapper[5072]: E0228 04:13:54.848395 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhh42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bkl6g_openshift-marketplace(9932ea0b-dd67-4ffb-a303-3ba7b97730ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 04:13:54 crc kubenswrapper[5072]: E0228 04:13:54.849572 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bkl6g" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef"
Feb 28 04:13:55 crc kubenswrapper[5072]: E0228 04:13:55.980276 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bkl6g" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef"
Feb 28 04:13:55 crc kubenswrapper[5072]: E0228 04:13:55.980508 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b4wcw" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23"
Feb 28 04:13:56 crc kubenswrapper[5072]: I0228 04:13:56.622530 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hklhr"
Feb 28 04:13:57 crc kubenswrapper[5072]: E0228 04:13:57.471674 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 28 04:13:57 crc kubenswrapper[5072]: E0228 04:13:57.471919 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rhc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-g7hzc_openshift-marketplace(11731378-2c2a-448a-918f-2e2f07619ee0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 04:13:57 crc kubenswrapper[5072]: E0228 04:13:57.473904 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-g7hzc" podUID="11731378-2c2a-448a-918f-2e2f07619ee0"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.143086 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537534-5p5md"]
Feb 28 04:14:00 crc kubenswrapper[5072]: E0228 04:14:00.144366 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b253b9-c164-4de4-83f8-bc2387e3d520" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.144461 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b253b9-c164-4de4-83f8-bc2387e3d520" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: E0228 04:14:00.144547 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b9fa87-83eb-407b-a4cf-dd288ca28bb8" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.144628 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b9fa87-83eb-407b-a4cf-dd288ca28bb8" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.144949 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b9fa87-83eb-407b-a4cf-dd288ca28bb8" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.145026 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b253b9-c164-4de4-83f8-bc2387e3d520" containerName="pruner"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.145454 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537534-5p5md"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.150986 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.152132 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537534-5p5md"]
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.327945 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84wzl\" (UniqueName: \"kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl\") pod \"auto-csr-approver-29537534-5p5md\" (UID: \"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac\") " pod="openshift-infra/auto-csr-approver-29537534-5p5md"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.429038 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84wzl\" (UniqueName: \"kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl\") pod \"auto-csr-approver-29537534-5p5md\" (UID: \"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac\") " pod="openshift-infra/auto-csr-approver-29537534-5p5md"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.471398 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84wzl\" (UniqueName: \"kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl\") pod \"auto-csr-approver-29537534-5p5md\" (UID: \"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac\") " pod="openshift-infra/auto-csr-approver-29537534-5p5md"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.769253 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537534-5p5md"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.845250 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.846379 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.849075 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.849400 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 28 04:14:00 crc kubenswrapper[5072]: I0228 04:14:00.863736 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.037791 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.037884 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.139324 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.139426 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.139496 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.158779 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: I0228 04:14:01.167133 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 28 04:14:01 crc kubenswrapper[5072]: E0228 04:14:01.664980 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-g7hzc" podUID="11731378-2c2a-448a-918f-2e2f07619ee0"
Feb 28 04:14:02 crc kubenswrapper[5072]: I0228 04:14:02.720524 5072 patch_prober.go:28] interesting pod/controller-manager-554bf44686-cvpls container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Feb 28 04:14:02 crc kubenswrapper[5072]: I0228 04:14:02.720711 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Feb 28 04:14:02 crc kubenswrapper[5072]: I0228 04:14:02.755045 5072 patch_prober.go:28] interesting pod/route-controller-manager-74667d9b5d-hz4rs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Feb 28 04:14:02 crc kubenswrapper[5072]: I0228 04:14:02.755140 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Feb 28 04:14:05 crc kubenswrapper[5072]: E0228 04:14:05.379831 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 28 04:14:05 crc kubenswrapper[5072]: E0228 04:14:05.380758 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjqsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gw7ld_openshift-marketplace(96c8a41b-5700-46e9-bea3-aac12066069f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 04:14:05 crc kubenswrapper[5072]: E0228 04:14:05.381899 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gw7ld" podUID="96c8a41b-5700-46e9-bea3-aac12066069f"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.242842 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.244428 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.271567 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.431439 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.431536 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.431566 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.532819 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.532973 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.533018 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.532969 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.533102 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.556680 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access\") pod \"installer-9-crc\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:06 crc kubenswrapper[5072]: I0228 04:14:06.570427 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 28 04:14:07 crc kubenswrapper[5072]: E0228 04:14:07.595425 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 28 04:14:07 crc kubenswrapper[5072]: E0228 04:14:07.596058 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8td77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cpbd5_openshift-marketplace(a0f264e9-eb98-4462-8a17-0dc4071f6b96): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 04:14:07 crc kubenswrapper[5072]: E0228 04:14:07.597788 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cpbd5" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96"
Feb 28 04:14:12 crc kubenswrapper[5072]: I0228 04:14:12.753106 5072 patch_prober.go:28] interesting pod/route-controller-manager-74667d9b5d-hz4rs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Feb 28 04:14:12 crc kubenswrapper[5072]: I0228 04:14:12.753742 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Feb 28 04:14:13 crc kubenswrapper[5072]: I0228 04:14:13.719886 5072 patch_prober.go:28] interesting pod/controller-manager-554bf44686-cvpls container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:14:13 crc kubenswrapper[5072]: I0228 04:14:13.719970 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.137819 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cpbd5" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96"
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.138156 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gw7ld" podUID="96c8a41b-5700-46e9-bea3-aac12066069f"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.236409 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.262447 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"]
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.263088 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.263101 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.263217 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" containerName="controller-manager"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.263591 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.275416 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"]
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.364296 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkxrk\" (UniqueName: \"kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk\") pod \"f06875c2-245b-4e2b-8746-5ebc4ecee456\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") "
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.364377 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles\") pod \"f06875c2-245b-4e2b-8746-5ebc4ecee456\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") "
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.364691 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca\") pod \"f06875c2-245b-4e2b-8746-5ebc4ecee456\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") "
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.364789 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config\") pod \"f06875c2-245b-4e2b-8746-5ebc4ecee456\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") "
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.364958 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert\") pod \"f06875c2-245b-4e2b-8746-5ebc4ecee456\" (UID: \"f06875c2-245b-4e2b-8746-5ebc4ecee456\") "
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365379 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365481 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365519 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f06875c2-245b-4e2b-8746-5ebc4ecee456" (UID: "f06875c2-245b-4e2b-8746-5ebc4ecee456"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365677 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twz9\" (UniqueName: \"kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365772 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365771 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca" (OuterVolumeSpecName: "client-ca") pod "f06875c2-245b-4e2b-8746-5ebc4ecee456" (UID: "f06875c2-245b-4e2b-8746-5ebc4ecee456"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.365933 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.366034 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.366061 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.366064 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config" (OuterVolumeSpecName: "config") pod "f06875c2-245b-4e2b-8746-5ebc4ecee456" (UID: "f06875c2-245b-4e2b-8746-5ebc4ecee456"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.372334 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f06875c2-245b-4e2b-8746-5ebc4ecee456" (UID: "f06875c2-245b-4e2b-8746-5ebc4ecee456"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.372916 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk" (OuterVolumeSpecName: "kube-api-access-kkxrk") pod "f06875c2-245b-4e2b-8746-5ebc4ecee456" (UID: "f06875c2-245b-4e2b-8746-5ebc4ecee456"). InnerVolumeSpecName "kube-api-access-kkxrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.401379 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.401623 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-phxxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g7wqp_openshift-marketplace(1917348a-ad88-41e1-a1f1-215706769c5e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 04:14:14 crc kubenswrapper[5072]: E0228 04:14:14.402917 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g7wqp" podUID="1917348a-ad88-41e1-a1f1-215706769c5e"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.467711 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.467816 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.467899 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.467933 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.467979 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5twz9\" (UniqueName: \"kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.468031 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06875c2-245b-4e2b-8746-5ebc4ecee456-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.468056 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkxrk\" (UniqueName: \"kubernetes.io/projected/f06875c2-245b-4e2b-8746-5ebc4ecee456-kube-api-access-kkxrk\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.468071 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06875c2-245b-4e2b-8746-5ebc4ecee456-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.469973 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.470022 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.470525 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config\") pod
\"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.474507 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.483155 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5twz9\" (UniqueName: \"kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9\") pod \"controller-manager-66c49549b6-9qlzl\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") " pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.584512 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.978566 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" event={"ID":"f06875c2-245b-4e2b-8746-5ebc4ecee456","Type":"ContainerDied","Data":"461d9d1615e27063f32c28e0ed1aff79d25b78c494a897bae58c81659ee32428"} Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.978598 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-554bf44686-cvpls" Feb 28 04:14:14 crc kubenswrapper[5072]: I0228 04:14:14.978620 5072 scope.go:117] "RemoveContainer" containerID="668d0462d3b77e594cc3f0740ea98b8f449f61ebc644a9776514a412f621a31a" Feb 28 04:14:15 crc kubenswrapper[5072]: I0228 04:14:15.078004 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"] Feb 28 04:14:15 crc kubenswrapper[5072]: I0228 04:14:15.081248 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-554bf44686-cvpls"] Feb 28 04:14:15 crc kubenswrapper[5072]: I0228 04:14:15.169546 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-95gbg"] Feb 28 04:14:16 crc kubenswrapper[5072]: I0228 04:14:16.666610 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f06875c2-245b-4e2b-8746-5ebc4ecee456" path="/var/lib/kubelet/pods/f06875c2-245b-4e2b-8746-5ebc4ecee456/volumes" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.597256 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g7wqp" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.649171 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.726386 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.726528 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"] Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.727212 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pqcmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOption
s:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2k7bm_openshift-marketplace(cd61caed-be31-4706-9677-0da76d2cb2e7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.727371 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.727388 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.727494 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" containerName="route-controller-manager" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.727924 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"] Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.728008 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.728944 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2k7bm" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.827442 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config\") pod \"91bd618e-7328-48ac-b2d9-23f924c1405f\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.827755 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl5mh\" (UniqueName: \"kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh\") pod \"91bd618e-7328-48ac-b2d9-23f924c1405f\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.827853 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert\") pod \"91bd618e-7328-48ac-b2d9-23f924c1405f\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.827905 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca\") pod \"91bd618e-7328-48ac-b2d9-23f924c1405f\" (UID: \"91bd618e-7328-48ac-b2d9-23f924c1405f\") " Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.828127 5072 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.828264 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.828284 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.828489 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459cd\" (UniqueName: \"kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.829215 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config" (OuterVolumeSpecName: "config") pod 
"91bd618e-7328-48ac-b2d9-23f924c1405f" (UID: "91bd618e-7328-48ac-b2d9-23f924c1405f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.829231 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca" (OuterVolumeSpecName: "client-ca") pod "91bd618e-7328-48ac-b2d9-23f924c1405f" (UID: "91bd618e-7328-48ac-b2d9-23f924c1405f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.836407 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh" (OuterVolumeSpecName: "kube-api-access-sl5mh") pod "91bd618e-7328-48ac-b2d9-23f924c1405f" (UID: "91bd618e-7328-48ac-b2d9-23f924c1405f"). InnerVolumeSpecName "kube-api-access-sl5mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.837187 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "91bd618e-7328-48ac-b2d9-23f924c1405f" (UID: "91bd618e-7328-48ac-b2d9-23f924c1405f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.935615 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459cd\" (UniqueName: \"kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.936156 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.936345 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.936377 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.936686 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl5mh\" (UniqueName: 
\"kubernetes.io/projected/91bd618e-7328-48ac-b2d9-23f924c1405f-kube-api-access-sl5mh\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.937510 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91bd618e-7328-48ac-b2d9-23f924c1405f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.937543 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.937558 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91bd618e-7328-48ac-b2d9-23f924c1405f-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.938842 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.939126 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.956178 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert\") pod 
\"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: I0228 04:14:18.965843 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459cd\" (UniqueName: \"kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd\") pod \"route-controller-manager-85f8f6c798-w7c4t\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") " pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.966051 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.966233 5072 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6c7n7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-67nm2_openshift-marketplace(c68fedb7-ccc2-4f59-8b91-7a59776ccd1d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 28 04:14:18 crc kubenswrapper[5072]: E0228 04:14:18.967423 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-67nm2" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" Feb 28 04:14:19 crc 
kubenswrapper[5072]: I0228 04:14:19.000293 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" event={"ID":"91bd618e-7328-48ac-b2d9-23f924c1405f","Type":"ContainerDied","Data":"9205f742f32d86e083b5ec9cd361075e5e65f35ed8297c39654c75839ad98d50"} Feb 28 04:14:19 crc kubenswrapper[5072]: I0228 04:14:19.000348 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs" Feb 28 04:14:19 crc kubenswrapper[5072]: I0228 04:14:19.049674 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:19 crc kubenswrapper[5072]: I0228 04:14:19.064149 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"] Feb 28 04:14:19 crc kubenswrapper[5072]: I0228 04:14:19.066949 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74667d9b5d-hz4rs"] Feb 28 04:14:19 crc kubenswrapper[5072]: E0228 04:14:19.826733 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2k7bm" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" Feb 28 04:14:19 crc kubenswrapper[5072]: E0228 04:14:19.826764 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-67nm2" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" Feb 28 04:14:19 crc kubenswrapper[5072]: 
W0228 04:14:19.830668 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109581ed_36ab_4625_bf7e_bcdecb30e35a.slice/crio-611f4451a0ec4813e74887122615517a55bc3ef4198077b92dcb3b699cc77a85 WatchSource:0}: Error finding container 611f4451a0ec4813e74887122615517a55bc3ef4198077b92dcb3b699cc77a85: Status 404 returned error can't find the container with id 611f4451a0ec4813e74887122615517a55bc3ef4198077b92dcb3b699cc77a85 Feb 28 04:14:19 crc kubenswrapper[5072]: I0228 04:14:19.849392 5072 scope.go:117] "RemoveContainer" containerID="5a0386cf627eb07759095a74d0bbdb4ad01e0759a35b1d9d2c3c58c576971184" Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.008108 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95gbg" event={"ID":"109581ed-36ab-4625-bf7e-bcdecb30e35a","Type":"ContainerStarted","Data":"611f4451a0ec4813e74887122615517a55bc3ef4198077b92dcb3b699cc77a85"} Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.105483 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.106146 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.106212 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:14:20 crc kubenswrapper[5072]: 
I0228 04:14:20.109035 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.109249 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d" gracePeriod=600 Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.168827 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"] Feb 28 04:14:20 crc kubenswrapper[5072]: W0228 04:14:20.182334 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e0dee6_4ad4_493f_971c_040de863f31f.slice/crio-43ef2428373774a3424d7effa6e78e16cee9049480ee15a69474c78eb03b09cf WatchSource:0}: Error finding container 43ef2428373774a3424d7effa6e78e16cee9049480ee15a69474c78eb03b09cf: Status 404 returned error can't find the container with id 43ef2428373774a3424d7effa6e78e16cee9049480ee15a69474c78eb03b09cf Feb 28 04:14:20 crc kubenswrapper[5072]: E0228 04:14:20.206522 5072 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda035bbab_1d8f_4120_aaf7_88984d936939.slice/crio-719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d.scope\": RecentStats: unable to find data in memory cache]" Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 
04:14:20.251633 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537534-5p5md"] Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.253028 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.320995 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"] Feb 28 04:14:20 crc kubenswrapper[5072]: W0228 04:14:20.323754 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cba0320_0aa0_4b0b_91bc_528003822f83.slice/crio-80bc84ba445e5088fce0e917cfe38c86a625c0ab2598eff6312dbde2bc9ad7e2 WatchSource:0}: Error finding container 80bc84ba445e5088fce0e917cfe38c86a625c0ab2598eff6312dbde2bc9ad7e2: Status 404 returned error can't find the container with id 80bc84ba445e5088fce0e917cfe38c86a625c0ab2598eff6312dbde2bc9ad7e2 Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.333414 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 04:14:20 crc kubenswrapper[5072]: W0228 04:14:20.351366 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode202e5af_1037_43ae_968e_0e594828048e.slice/crio-c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012 WatchSource:0}: Error finding container c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012: Status 404 returned error can't find the container with id c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012 Feb 28 04:14:20 crc kubenswrapper[5072]: I0228 04:14:20.666858 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bd618e-7328-48ac-b2d9-23f924c1405f" path="/var/lib/kubelet/pods/91bd618e-7328-48ac-b2d9-23f924c1405f/volumes" Feb 28 04:14:20 crc kubenswrapper[5072]: E0228 
04:14:20.822596 5072 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 28 04:14:20 crc kubenswrapper[5072]: E0228 04:14:20.823113 5072 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 28 04:14:20 crc kubenswrapper[5072]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 28 04:14:20 crc kubenswrapper[5072]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kmxjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29537532-qwbxr_openshift-infra(c5d29ab8-044b-4fc5-b5eb-02c5ac608dac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 28 04:14:20 crc kubenswrapper[5072]: > logger="UnhandledError" Feb 28 04:14:20 crc kubenswrapper[5072]: E0228 04:14:20.824299 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-infra/auto-csr-approver-29537532-qwbxr" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.028011 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537534-5p5md" event={"ID":"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac","Type":"ContainerStarted","Data":"09c32bc8294103f39e7710c728979897ef91790aac8ab5cb38c48c9fd2da68d7"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.029625 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e202e5af-1037-43ae-968e-0e594828048e","Type":"ContainerStarted","Data":"02231ede3d004e870f63d7e1d9de10eed2d61f06b210f1fbb725318b44d602b7"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.029701 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e202e5af-1037-43ae-968e-0e594828048e","Type":"ContainerStarted","Data":"c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.032521 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d" exitCode=0 Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.032585 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.035845 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" event={"ID":"0cba0320-0aa0-4b0b-91bc-528003822f83","Type":"ContainerStarted","Data":"8666bac49b8a01a0108c6c57e8cae369620fba8f9177596a43e844ba35a07b41"} Feb 28 
04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.035879 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" event={"ID":"0cba0320-0aa0-4b0b-91bc-528003822f83","Type":"ContainerStarted","Data":"80bc84ba445e5088fce0e917cfe38c86a625c0ab2598eff6312dbde2bc9ad7e2"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.036564 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.053207 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a496c146-b799-49dc-a8c7-9339af045983","Type":"ContainerStarted","Data":"10bc6648c6cb08865827329d87dc9ca69f71090c91ecefc616f2a003bcdae35e"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.053258 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a496c146-b799-49dc-a8c7-9339af045983","Type":"ContainerStarted","Data":"21d36fef69e7f1b549524dae488dd19a8a05d0f2bc3e943b1e3c14e9c0ba139f"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.056575 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" event={"ID":"28e0dee6-4ad4-493f-971c-040de863f31f","Type":"ContainerStarted","Data":"f3e88a3e73e05d70ece61a038b1e2aa563238c668a81d161808f04bdcf6e775b"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.056653 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" event={"ID":"28e0dee6-4ad4-493f-971c-040de863f31f","Type":"ContainerStarted","Data":"43ef2428373774a3424d7effa6e78e16cee9049480ee15a69474c78eb03b09cf"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.056813 5072 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.082472 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=15.082434003 podStartE2EDuration="15.082434003s" podCreationTimestamp="2026-02-28 04:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:21.072477384 +0000 UTC m=+283.067207596" watchObservedRunningTime="2026-02-28 04:14:21.082434003 +0000 UTC m=+283.077164405" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.088613 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95gbg" event={"ID":"109581ed-36ab-4625-bf7e-bcdecb30e35a","Type":"ContainerStarted","Data":"42d62b4252cc8be313b7d046ae9c8813a3c0fe094b08a1d2b055d13a3935ab17"} Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.088791 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.104008 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=21.103980482 podStartE2EDuration="21.103980482s" podCreationTimestamp="2026-02-28 04:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:21.101061322 +0000 UTC m=+283.095791514" watchObservedRunningTime="2026-02-28 04:14:21.103980482 +0000 UTC m=+283.098710674" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.127863 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" podStartSLOduration=21.127843924 podStartE2EDuration="21.127843924s" podCreationTimestamp="2026-02-28 04:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:21.126340647 +0000 UTC m=+283.121070849" watchObservedRunningTime="2026-02-28 04:14:21.127843924 +0000 UTC m=+283.122574116" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.148608 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" podStartSLOduration=21.148583207 podStartE2EDuration="21.148583207s" podCreationTimestamp="2026-02-28 04:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:21.145829502 +0000 UTC m=+283.140559694" watchObservedRunningTime="2026-02-28 04:14:21.148583207 +0000 UTC m=+283.143313400" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.250542 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" Feb 28 04:14:21 crc kubenswrapper[5072]: I0228 04:14:21.313758 5072 ???:1] "http: TLS handshake error from 192.168.126.11:35664: no serving certificate available for the kubelet" Feb 28 04:14:22 crc kubenswrapper[5072]: I0228 04:14:22.105036 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-95gbg" event={"ID":"109581ed-36ab-4625-bf7e-bcdecb30e35a","Type":"ContainerStarted","Data":"8e818d40e2d29b1dbe4edb1f360414d850f4ad7748b4386362b7e968b27a47cb"} Feb 28 04:14:22 crc kubenswrapper[5072]: I0228 04:14:22.107945 5072 generic.go:334] "Generic (PLEG): container finished" podID="a496c146-b799-49dc-a8c7-9339af045983" 
containerID="10bc6648c6cb08865827329d87dc9ca69f71090c91ecefc616f2a003bcdae35e" exitCode=0 Feb 28 04:14:22 crc kubenswrapper[5072]: I0228 04:14:22.108043 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a496c146-b799-49dc-a8c7-9339af045983","Type":"ContainerDied","Data":"10bc6648c6cb08865827329d87dc9ca69f71090c91ecefc616f2a003bcdae35e"} Feb 28 04:14:22 crc kubenswrapper[5072]: I0228 04:14:22.119026 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-95gbg" podStartSLOduration=239.119003927 podStartE2EDuration="3m59.119003927s" podCreationTimestamp="2026-02-28 04:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:22.117337905 +0000 UTC m=+284.112068107" watchObservedRunningTime="2026-02-28 04:14:22.119003927 +0000 UTC m=+284.113734129" Feb 28 04:14:22 crc kubenswrapper[5072]: E0228 04:14:22.159808 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.115450 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537534-5p5md" event={"ID":"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac","Type":"ContainerStarted","Data":"4399afb27e957fc7a9641764c0f29b17e57c2b8b1dd42a347c19d81079014e38"} Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.117370 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" 
event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35"} Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.118907 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerStarted","Data":"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17"} Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.121440 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerStarted","Data":"c9f2b9b521accab09159ba96b9529e3c5f988f9d28fca9872972809b01c10329"} Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.123480 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerStarted","Data":"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163"} Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.155148 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537534-5p5md" podStartSLOduration=20.752684573 podStartE2EDuration="23.155128765s" podCreationTimestamp="2026-02-28 04:14:00 +0000 UTC" firstStartedPulling="2026-02-28 04:14:20.283405639 +0000 UTC m=+282.278135831" lastFinishedPulling="2026-02-28 04:14:22.685849821 +0000 UTC m=+284.680580023" observedRunningTime="2026-02-28 04:14:23.136196007 +0000 UTC m=+285.130926209" watchObservedRunningTime="2026-02-28 04:14:23.155128765 +0000 UTC m=+285.149858957" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.572673 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.574939 5072 csr.go:261] certificate signing request csr-fz9b8 is approved, waiting to be issued Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.581878 5072 csr.go:257] certificate signing request csr-fz9b8 is issued Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.627974 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access\") pod \"a496c146-b799-49dc-a8c7-9339af045983\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.628490 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir\") pod \"a496c146-b799-49dc-a8c7-9339af045983\" (UID: \"a496c146-b799-49dc-a8c7-9339af045983\") " Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.628815 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a496c146-b799-49dc-a8c7-9339af045983" (UID: "a496c146-b799-49dc-a8c7-9339af045983"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.637272 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a496c146-b799-49dc-a8c7-9339af045983" (UID: "a496c146-b799-49dc-a8c7-9339af045983"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.731502 5072 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a496c146-b799-49dc-a8c7-9339af045983-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:23 crc kubenswrapper[5072]: I0228 04:14:23.731702 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a496c146-b799-49dc-a8c7-9339af045983-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.132885 5072 generic.go:334] "Generic (PLEG): container finished" podID="11731378-2c2a-448a-918f-2e2f07619ee0" containerID="ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163" exitCode=0 Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.132995 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerDied","Data":"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163"} Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.138131 5072 generic.go:334] "Generic (PLEG): container finished" podID="a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" containerID="4399afb27e957fc7a9641764c0f29b17e57c2b8b1dd42a347c19d81079014e38" exitCode=0 Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.138696 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537534-5p5md" event={"ID":"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac","Type":"ContainerDied","Data":"4399afb27e957fc7a9641764c0f29b17e57c2b8b1dd42a347c19d81079014e38"} Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.146500 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"a496c146-b799-49dc-a8c7-9339af045983","Type":"ContainerDied","Data":"21d36fef69e7f1b549524dae488dd19a8a05d0f2bc3e943b1e3c14e9c0ba139f"} Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.146580 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d36fef69e7f1b549524dae488dd19a8a05d0f2bc3e943b1e3c14e9c0ba139f" Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.146633 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.156408 5072 generic.go:334] "Generic (PLEG): container finished" podID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerID="75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17" exitCode=0 Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.156543 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerDied","Data":"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17"} Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.164575 5072 generic.go:334] "Generic (PLEG): container finished" podID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerID="c9f2b9b521accab09159ba96b9529e3c5f988f9d28fca9872972809b01c10329" exitCode=0 Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.164682 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerDied","Data":"c9f2b9b521accab09159ba96b9529e3c5f988f9d28fca9872972809b01c10329"} Feb 28 04:14:24 crc kubenswrapper[5072]: I0228 04:14:24.582688 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-23 03:03:27.699247711 +0000 UTC Feb 28 04:14:24 crc 
kubenswrapper[5072]: I0228 04:14:24.583119 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7150h49m3.116134018s for next certificate rotation Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.193474 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerStarted","Data":"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b"} Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.217931 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g7hzc" podStartSLOduration=2.96126569 podStartE2EDuration="1m2.217906378s" podCreationTimestamp="2026-02-28 04:13:23 +0000 UTC" firstStartedPulling="2026-02-28 04:13:25.375810938 +0000 UTC m=+227.370541130" lastFinishedPulling="2026-02-28 04:14:24.632451626 +0000 UTC m=+286.627181818" observedRunningTime="2026-02-28 04:14:25.213504001 +0000 UTC m=+287.208234203" watchObservedRunningTime="2026-02-28 04:14:25.217906378 +0000 UTC m=+287.212636570" Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.504539 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537534-5p5md" Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.555961 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84wzl\" (UniqueName: \"kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl\") pod \"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac\" (UID: \"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac\") " Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.563714 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl" (OuterVolumeSpecName: "kube-api-access-84wzl") pod "a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" (UID: "a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac"). InnerVolumeSpecName "kube-api-access-84wzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.583939 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-14 11:19:48.77000757 +0000 UTC Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.583983 5072 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7687h5m23.186028343s for next certificate rotation Feb 28 04:14:25 crc kubenswrapper[5072]: I0228 04:14:25.657803 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84wzl\" (UniqueName: \"kubernetes.io/projected/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac-kube-api-access-84wzl\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.200874 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerStarted","Data":"30e3b6c5947e06710b1cc001bf438af2acc2f7ffb87e8ed76d0f8bf924d6f3b6"} Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 
04:14:26.204004 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537534-5p5md" Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.204001 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537534-5p5md" event={"ID":"a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac","Type":"ContainerDied","Data":"09c32bc8294103f39e7710c728979897ef91790aac8ab5cb38c48c9fd2da68d7"} Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.204070 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c32bc8294103f39e7710c728979897ef91790aac8ab5cb38c48c9fd2da68d7" Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.207666 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerStarted","Data":"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0"} Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.225860 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bkl6g" podStartSLOduration=4.627416479 podStartE2EDuration="1m3.225840111s" podCreationTimestamp="2026-02-28 04:13:23 +0000 UTC" firstStartedPulling="2026-02-28 04:13:26.394621373 +0000 UTC m=+228.389351565" lastFinishedPulling="2026-02-28 04:14:24.993045005 +0000 UTC m=+286.987775197" observedRunningTime="2026-02-28 04:14:26.224059576 +0000 UTC m=+288.218789768" watchObservedRunningTime="2026-02-28 04:14:26.225840111 +0000 UTC m=+288.220570293" Feb 28 04:14:26 crc kubenswrapper[5072]: I0228 04:14:26.250744 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b4wcw" podStartSLOduration=3.495043229 podStartE2EDuration="1m3.250722584s" podCreationTimestamp="2026-02-28 04:13:23 +0000 UTC" firstStartedPulling="2026-02-28 
04:13:25.288975613 +0000 UTC m=+227.283705805" lastFinishedPulling="2026-02-28 04:14:25.044654968 +0000 UTC m=+287.039385160" observedRunningTime="2026-02-28 04:14:26.248609228 +0000 UTC m=+288.243339420" watchObservedRunningTime="2026-02-28 04:14:26.250722584 +0000 UTC m=+288.245452776" Feb 28 04:14:27 crc kubenswrapper[5072]: I0228 04:14:27.219513 5072 generic.go:334] "Generic (PLEG): container finished" podID="96c8a41b-5700-46e9-bea3-aac12066069f" containerID="0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f" exitCode=0 Feb 28 04:14:27 crc kubenswrapper[5072]: I0228 04:14:27.219550 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerDied","Data":"0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f"} Feb 28 04:14:28 crc kubenswrapper[5072]: I0228 04:14:28.230290 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerStarted","Data":"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d"} Feb 28 04:14:28 crc kubenswrapper[5072]: I0228 04:14:28.247343 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gw7ld" podStartSLOduration=2.057155839 podStartE2EDuration="1m3.247315032s" podCreationTimestamp="2026-02-28 04:13:25 +0000 UTC" firstStartedPulling="2026-02-28 04:13:26.426124861 +0000 UTC m=+228.420855053" lastFinishedPulling="2026-02-28 04:14:27.616284054 +0000 UTC m=+289.611014246" observedRunningTime="2026-02-28 04:14:28.245267467 +0000 UTC m=+290.239997659" watchObservedRunningTime="2026-02-28 04:14:28.247315032 +0000 UTC m=+290.242045214" Feb 28 04:14:29 crc kubenswrapper[5072]: I0228 04:14:29.239793 5072 generic.go:334] "Generic (PLEG): container finished" podID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" 
containerID="ba78c46299e964b6863eda4b9d00f6a71604ec61924214baf6f09d2d13c40503" exitCode=0 Feb 28 04:14:29 crc kubenswrapper[5072]: I0228 04:14:29.239882 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerDied","Data":"ba78c46299e964b6863eda4b9d00f6a71604ec61924214baf6f09d2d13c40503"} Feb 28 04:14:30 crc kubenswrapper[5072]: I0228 04:14:30.255471 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerStarted","Data":"3ed31b110f5ed5fb1341c895cbe0ca9cea3ef1e2896d1d99cc0e83b9f62ee130"} Feb 28 04:14:30 crc kubenswrapper[5072]: I0228 04:14:30.280853 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cpbd5" podStartSLOduration=2.690037068 podStartE2EDuration="1m7.280828445s" podCreationTimestamp="2026-02-28 04:13:23 +0000 UTC" firstStartedPulling="2026-02-28 04:13:25.29886292 +0000 UTC m=+227.293593112" lastFinishedPulling="2026-02-28 04:14:29.889654297 +0000 UTC m=+291.884384489" observedRunningTime="2026-02-28 04:14:30.279310039 +0000 UTC m=+292.274040231" watchObservedRunningTime="2026-02-28 04:14:30.280828445 +0000 UTC m=+292.275558637" Feb 28 04:14:32 crc kubenswrapper[5072]: I0228 04:14:32.274655 5072 generic.go:334] "Generic (PLEG): container finished" podID="1917348a-ad88-41e1-a1f1-215706769c5e" containerID="0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8" exitCode=0 Feb 28 04:14:32 crc kubenswrapper[5072]: I0228 04:14:32.275361 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerDied","Data":"0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8"} Feb 28 04:14:32 crc kubenswrapper[5072]: I0228 
04:14:32.277910 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerStarted","Data":"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751"} Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.285513 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerStarted","Data":"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345"} Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.288140 5072 generic.go:334] "Generic (PLEG): container finished" podID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerID="7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751" exitCode=0 Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.288181 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerDied","Data":"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751"} Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.307123 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7wqp" podStartSLOduration=2.781806956 podStartE2EDuration="1m8.307104641s" podCreationTimestamp="2026-02-28 04:13:25 +0000 UTC" firstStartedPulling="2026-02-28 04:13:27.494984779 +0000 UTC m=+229.489714961" lastFinishedPulling="2026-02-28 04:14:33.020282454 +0000 UTC m=+295.015012646" observedRunningTime="2026-02-28 04:14:33.306159833 +0000 UTC m=+295.300890025" watchObservedRunningTime="2026-02-28 04:14:33.307104641 +0000 UTC m=+295.301834833" Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.746551 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g7hzc" Feb 
28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.747152 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g7hzc"
Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.871065 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:14:33 crc kubenswrapper[5072]: I0228 04:14:33.871321 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.041392 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cpbd5"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.041459 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cpbd5"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.135414 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cpbd5"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.136527 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.138221 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g7hzc"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.240544 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.240749 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.288544 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.298551 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerStarted","Data":"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55"}
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.350971 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g7hzc"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.352113 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:34 crc kubenswrapper[5072]: I0228 04:14:34.352223 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:14:35 crc kubenswrapper[5072]: I0228 04:14:35.326225 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-67nm2" podStartSLOduration=4.185715638 podStartE2EDuration="1m9.326201968s" podCreationTimestamp="2026-02-28 04:13:26 +0000 UTC" firstStartedPulling="2026-02-28 04:13:28.545299962 +0000 UTC m=+230.540030154" lastFinishedPulling="2026-02-28 04:14:33.685786292 +0000 UTC m=+295.680516484" observedRunningTime="2026-02-28 04:14:35.32306082 +0000 UTC m=+297.317791012" watchObservedRunningTime="2026-02-28 04:14:35.326201968 +0000 UTC m=+297.320932150"
Feb 28 04:14:35 crc kubenswrapper[5072]: I0228 04:14:35.707983 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:14:35 crc kubenswrapper[5072]: I0228 04:14:35.708055 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:14:35 crc kubenswrapper[5072]: I0228 04:14:35.752213 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.055498 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.055597 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.105177 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7wqp"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.360167 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.899325 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-67nm2"
Feb 28 04:14:36 crc kubenswrapper[5072]: I0228 04:14:36.899768 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-67nm2"
Feb 28 04:14:37 crc kubenswrapper[5072]: I0228 04:14:37.320112 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerStarted","Data":"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372"}
Feb 28 04:14:37 crc kubenswrapper[5072]: I0228 04:14:37.937747 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-67nm2" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="registry-server" probeResult="failure" output=<
Feb 28 04:14:37 crc kubenswrapper[5072]: timeout: failed to connect service ":50051" within 1s
Feb 28 04:14:37 crc kubenswrapper[5072]: >
Feb 28 04:14:38 crc kubenswrapper[5072]: I0228 04:14:38.186162 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"]
Feb 28 04:14:38 crc kubenswrapper[5072]: I0228 04:14:38.186872 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bkl6g" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="registry-server" containerID="cri-o://30e3b6c5947e06710b1cc001bf438af2acc2f7ffb87e8ed76d0f8bf924d6f3b6" gracePeriod=2
Feb 28 04:14:38 crc kubenswrapper[5072]: I0228 04:14:38.327480 5072 generic.go:334] "Generic (PLEG): container finished" podID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerID="0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372" exitCode=0
Feb 28 04:14:38 crc kubenswrapper[5072]: I0228 04:14:38.327529 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerDied","Data":"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372"}
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.335462 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" event={"ID":"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac","Type":"ContainerStarted","Data":"95d0fc9d8c611beafb142746f4af121ed67a7672b0fe8f9de762c642787f81f9"}
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.337949 5072 generic.go:334] "Generic (PLEG): container finished" podID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerID="30e3b6c5947e06710b1cc001bf438af2acc2f7ffb87e8ed76d0f8bf924d6f3b6" exitCode=0
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.338002 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerDied","Data":"30e3b6c5947e06710b1cc001bf438af2acc2f7ffb87e8ed76d0f8bf924d6f3b6"}
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.405409 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.421200 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhh42\" (UniqueName: \"kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42\") pod \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") "
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.421347 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content\") pod \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") "
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.421455 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities\") pod \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\" (UID: \"9932ea0b-dd67-4ffb-a303-3ba7b97730ef\") "
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.422444 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities" (OuterVolumeSpecName: "utilities") pod "9932ea0b-dd67-4ffb-a303-3ba7b97730ef" (UID: "9932ea0b-dd67-4ffb-a303-3ba7b97730ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.429879 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42" (OuterVolumeSpecName: "kube-api-access-rhh42") pod "9932ea0b-dd67-4ffb-a303-3ba7b97730ef" (UID: "9932ea0b-dd67-4ffb-a303-3ba7b97730ef"). InnerVolumeSpecName "kube-api-access-rhh42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.429928 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" podStartSLOduration=79.238411727 podStartE2EDuration="2m39.429904875s" podCreationTimestamp="2026-02-28 04:12:00 +0000 UTC" firstStartedPulling="2026-02-28 04:13:18.591488186 +0000 UTC m=+220.586218378" lastFinishedPulling="2026-02-28 04:14:38.782981334 +0000 UTC m=+300.777711526" observedRunningTime="2026-02-28 04:14:39.362628146 +0000 UTC m=+301.357358348" watchObservedRunningTime="2026-02-28 04:14:39.429904875 +0000 UTC m=+301.424635077"
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.520830 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9932ea0b-dd67-4ffb-a303-3ba7b97730ef" (UID: "9932ea0b-dd67-4ffb-a303-3ba7b97730ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.522958 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.523018 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:39 crc kubenswrapper[5072]: I0228 04:14:39.523075 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhh42\" (UniqueName: \"kubernetes.io/projected/9932ea0b-dd67-4ffb-a303-3ba7b97730ef-kube-api-access-rhh42\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.345704 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkl6g"
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.349879 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkl6g" event={"ID":"9932ea0b-dd67-4ffb-a303-3ba7b97730ef","Type":"ContainerDied","Data":"422f9715aab56b28df89145d4595997b85136a303fdc202f7d0aa3f548d170d5"}
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.349998 5072 scope.go:117] "RemoveContainer" containerID="30e3b6c5947e06710b1cc001bf438af2acc2f7ffb87e8ed76d0f8bf924d6f3b6"
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.351806 5072 generic.go:334] "Generic (PLEG): container finished" podID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" containerID="95d0fc9d8c611beafb142746f4af121ed67a7672b0fe8f9de762c642787f81f9" exitCode=0
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.351840 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" event={"ID":"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac","Type":"ContainerDied","Data":"95d0fc9d8c611beafb142746f4af121ed67a7672b0fe8f9de762c642787f81f9"}
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.399028 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"]
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.408793 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bkl6g"]
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.498807 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"]
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.499110 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" podUID="0cba0320-0aa0-4b0b-91bc-528003822f83" containerName="controller-manager" containerID="cri-o://8666bac49b8a01a0108c6c57e8cae369620fba8f9177596a43e844ba35a07b41" gracePeriod=30
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.575358 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"]
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.575602 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" podUID="28e0dee6-4ad4-493f-971c-040de863f31f" containerName="route-controller-manager" containerID="cri-o://f3e88a3e73e05d70ece61a038b1e2aa563238c668a81d161808f04bdcf6e775b" gracePeriod=30
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.667185 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" path="/var/lib/kubelet/pods/9932ea0b-dd67-4ffb-a303-3ba7b97730ef/volumes"
Feb 28 04:14:40 crc kubenswrapper[5072]: I0228 04:14:40.950158 5072 scope.go:117] "RemoveContainer" containerID="c9f2b9b521accab09159ba96b9529e3c5f988f9d28fca9872972809b01c10329"
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.359242 5072 generic.go:334] "Generic (PLEG): container finished" podID="0cba0320-0aa0-4b0b-91bc-528003822f83" containerID="8666bac49b8a01a0108c6c57e8cae369620fba8f9177596a43e844ba35a07b41" exitCode=0
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.359362 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" event={"ID":"0cba0320-0aa0-4b0b-91bc-528003822f83","Type":"ContainerDied","Data":"8666bac49b8a01a0108c6c57e8cae369620fba8f9177596a43e844ba35a07b41"}
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.361101 5072 generic.go:334] "Generic (PLEG): container finished" podID="28e0dee6-4ad4-493f-971c-040de863f31f" containerID="f3e88a3e73e05d70ece61a038b1e2aa563238c668a81d161808f04bdcf6e775b" exitCode=0
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.361224 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" event={"ID":"28e0dee6-4ad4-493f-971c-040de863f31f","Type":"ContainerDied","Data":"f3e88a3e73e05d70ece61a038b1e2aa563238c668a81d161808f04bdcf6e775b"}
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.928914 5072 scope.go:117] "RemoveContainer" containerID="8cc0adc3e6f35342eccfa44b64faa6fe88f6f6a8adc3268fea50554b0f5a9de0"
Feb 28 04:14:41 crc kubenswrapper[5072]: I0228 04:14:41.983752 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537532-qwbxr"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.058096 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxjs\" (UniqueName: \"kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs\") pod \"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac\" (UID: \"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.067626 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs" (OuterVolumeSpecName: "kube-api-access-kmxjs") pod "c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" (UID: "c5d29ab8-044b-4fc5-b5eb-02c5ac608dac"). InnerVolumeSpecName "kube-api-access-kmxjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.159912 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxjs\" (UniqueName: \"kubernetes.io/projected/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac-kube-api-access-kmxjs\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.164667 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.241823 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.261945 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca\") pod \"0cba0320-0aa0-4b0b-91bc-528003822f83\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.262949 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5twz9\" (UniqueName: \"kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9\") pod \"0cba0320-0aa0-4b0b-91bc-528003822f83\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.262889 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca" (OuterVolumeSpecName: "client-ca") pod "0cba0320-0aa0-4b0b-91bc-528003822f83" (UID: "0cba0320-0aa0-4b0b-91bc-528003822f83"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.263030 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca\") pod \"28e0dee6-4ad4-493f-971c-040de863f31f\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.271014 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9" (OuterVolumeSpecName: "kube-api-access-5twz9") pod "0cba0320-0aa0-4b0b-91bc-528003822f83" (UID: "0cba0320-0aa0-4b0b-91bc-528003822f83"). InnerVolumeSpecName "kube-api-access-5twz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.272379 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert\") pod \"0cba0320-0aa0-4b0b-91bc-528003822f83\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.272464 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles\") pod \"0cba0320-0aa0-4b0b-91bc-528003822f83\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.272543 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config\") pod \"0cba0320-0aa0-4b0b-91bc-528003822f83\" (UID: \"0cba0320-0aa0-4b0b-91bc-528003822f83\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.272580 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config\") pod \"28e0dee6-4ad4-493f-971c-040de863f31f\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.273614 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-459cd\" (UniqueName: \"kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd\") pod \"28e0dee6-4ad4-493f-971c-040de863f31f\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.269712 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca" (OuterVolumeSpecName: "client-ca") pod "28e0dee6-4ad4-493f-971c-040de863f31f" (UID: "28e0dee6-4ad4-493f-971c-040de863f31f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.273738 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert\") pod \"28e0dee6-4ad4-493f-971c-040de863f31f\" (UID: \"28e0dee6-4ad4-493f-971c-040de863f31f\") "
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.274719 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.274747 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5twz9\" (UniqueName: \"kubernetes.io/projected/0cba0320-0aa0-4b0b-91bc-528003822f83-kube-api-access-5twz9\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.274768 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.275884 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0cba0320-0aa0-4b0b-91bc-528003822f83" (UID: "0cba0320-0aa0-4b0b-91bc-528003822f83"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.276014 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config" (OuterVolumeSpecName: "config") pod "0cba0320-0aa0-4b0b-91bc-528003822f83" (UID: "0cba0320-0aa0-4b0b-91bc-528003822f83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.277203 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config" (OuterVolumeSpecName: "config") pod "28e0dee6-4ad4-493f-971c-040de863f31f" (UID: "28e0dee6-4ad4-493f-971c-040de863f31f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.277295 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28e0dee6-4ad4-493f-971c-040de863f31f" (UID: "28e0dee6-4ad4-493f-971c-040de863f31f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.277715 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0cba0320-0aa0-4b0b-91bc-528003822f83" (UID: "0cba0320-0aa0-4b0b-91bc-528003822f83"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.286309 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd" (OuterVolumeSpecName: "kube-api-access-459cd") pod "28e0dee6-4ad4-493f-971c-040de863f31f" (UID: "28e0dee6-4ad4-493f-971c-040de863f31f"). InnerVolumeSpecName "kube-api-access-459cd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.371899 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537532-qwbxr" event={"ID":"c5d29ab8-044b-4fc5-b5eb-02c5ac608dac","Type":"ContainerDied","Data":"96d62b60438b0c5446e965bc0dc000febacf642171abf1c16bf9b8c8c15657b7"}
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.371960 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d62b60438b0c5446e965bc0dc000febacf642171abf1c16bf9b8c8c15657b7"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.372437 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537532-qwbxr"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376613 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376735 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e0dee6-4ad4-493f-971c-040de863f31f-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376766 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cba0320-0aa0-4b0b-91bc-528003822f83-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376794 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-459cd\" (UniqueName: \"kubernetes.io/projected/28e0dee6-4ad4-493f-971c-040de863f31f-kube-api-access-459cd\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376822 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e0dee6-4ad4-493f-971c-040de863f31f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.376846 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cba0320-0aa0-4b0b-91bc-528003822f83-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.377169 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl" event={"ID":"0cba0320-0aa0-4b0b-91bc-528003822f83","Type":"ContainerDied","Data":"80bc84ba445e5088fce0e917cfe38c86a625c0ab2598eff6312dbde2bc9ad7e2"}
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.377238 5072 scope.go:117] "RemoveContainer" containerID="8666bac49b8a01a0108c6c57e8cae369620fba8f9177596a43e844ba35a07b41"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.377480 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66c49549b6-9qlzl"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.394289 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t" event={"ID":"28e0dee6-4ad4-493f-971c-040de863f31f","Type":"ContainerDied","Data":"43ef2428373774a3424d7effa6e78e16cee9049480ee15a69474c78eb03b09cf"}
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.394370 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.420392 5072 scope.go:117] "RemoveContainer" containerID="f3e88a3e73e05d70ece61a038b1e2aa563238c668a81d161808f04bdcf6e775b"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.429689 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"]
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.432463 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f8f6c798-w7c4t"]
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.450957 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"]
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.456182 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66c49549b6-9qlzl"]
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.666729 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cba0320-0aa0-4b0b-91bc-528003822f83" path="/var/lib/kubelet/pods/0cba0320-0aa0-4b0b-91bc-528003822f83/volumes"
Feb 28 04:14:42 crc kubenswrapper[5072]: I0228 04:14:42.667337 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e0dee6-4ad4-493f-971c-040de863f31f" path="/var/lib/kubelet/pods/28e0dee6-4ad4-493f-971c-040de863f31f/volumes"
Feb 28 04:14:43 crc kubenswrapper[5072]: I0228 04:14:43.405578 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerStarted","Data":"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b"}
Feb 28 04:14:43 crc kubenswrapper[5072]: I0228 04:14:43.427952 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2k7bm" podStartSLOduration=4.056900177 podStartE2EDuration="1m17.427928421s" podCreationTimestamp="2026-02-28 04:13:26 +0000 UTC" firstStartedPulling="2026-02-28 04:13:28.557855591 +0000 UTC m=+230.552585783" lastFinishedPulling="2026-02-28 04:14:41.928883825 +0000 UTC m=+303.923614027" observedRunningTime="2026-02-28 04:14:43.423739761 +0000 UTC m=+305.418469953" watchObservedRunningTime="2026-02-28 04:14:43.427928421 +0000 UTC m=+305.422658613"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.078199 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cpbd5"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984003 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"]
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984323 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="extract-utilities"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984338 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="extract-utilities"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984350 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="registry-server"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984356 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="registry-server"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984369 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984375 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984385 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba0320-0aa0-4b0b-91bc-528003822f83" containerName="controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984393 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba0320-0aa0-4b0b-91bc-528003822f83" containerName="controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984399 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="extract-content"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984407 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="extract-content"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984418 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984423 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984432 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a496c146-b799-49dc-a8c7-9339af045983" containerName="pruner"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984437 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a496c146-b799-49dc-a8c7-9339af045983" containerName="pruner"
Feb 28 04:14:44 crc kubenswrapper[5072]: E0228 04:14:44.984454 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e0dee6-4ad4-493f-971c-040de863f31f" containerName="route-controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984459 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e0dee6-4ad4-493f-971c-040de863f31f" containerName="route-controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984557 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984571 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="9932ea0b-dd67-4ffb-a303-3ba7b97730ef" containerName="registry-server"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984581 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="a496c146-b799-49dc-a8c7-9339af045983" containerName="pruner"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984592 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" containerName="oc"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984600 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e0dee6-4ad4-493f-971c-040de863f31f" containerName="route-controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.984609 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cba0320-0aa0-4b0b-91bc-528003822f83" containerName="controller-manager"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.985104 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.987988 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"]
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.988889 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.990822 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.991285 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994365 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994380 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994659 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994740 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994803 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994806 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.994983 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.995002 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.995088 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 28 04:14:44 crc kubenswrapper[5072]: I0228 04:14:44.995320 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.008941 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx"
Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009276 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx"
Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009425 5072 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009532 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009668 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009810 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8c9q\" (UniqueName: \"kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.009944 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config\") pod 
\"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.010599 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8l9\" (UniqueName: \"kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.010768 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.011573 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"] Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.016360 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.033997 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"] Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112001 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " 
pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112063 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112082 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112109 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112136 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8c9q\" (UniqueName: \"kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112168 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112197 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8l9\" (UniqueName: \"kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112218 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.112243 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.113173 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.113286 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.113501 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.113766 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.114076 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.123325 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc 
kubenswrapper[5072]: I0228 04:14:45.133410 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8c9q\" (UniqueName: \"kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q\") pod \"route-controller-manager-d8d94fbf-wtrhr\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") " pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.137132 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.138075 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8l9\" (UniqueName: \"kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9\") pod \"controller-manager-78ff6764c-7x6gx\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") " pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.319114 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.340300 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.561225 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"] Feb 28 04:14:45 crc kubenswrapper[5072]: I0228 04:14:45.872927 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"] Feb 28 04:14:45 crc kubenswrapper[5072]: W0228 04:14:45.878105 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbeb79c8_992a_461c_b02f_a58f5aaa31f4.slice/crio-c2e375b7fbac84ebe982e68569020470d2726511ba26039c486ec95178158f9b WatchSource:0}: Error finding container c2e375b7fbac84ebe982e68569020470d2726511ba26039c486ec95178158f9b: Status 404 returned error can't find the container with id c2e375b7fbac84ebe982e68569020470d2726511ba26039c486ec95178158f9b Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.106667 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.438017 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" event={"ID":"cbeb79c8-992a-461c-b02f-a58f5aaa31f4","Type":"ContainerStarted","Data":"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"} Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.438083 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" event={"ID":"cbeb79c8-992a-461c-b02f-a58f5aaa31f4","Type":"ContainerStarted","Data":"c2e375b7fbac84ebe982e68569020470d2726511ba26039c486ec95178158f9b"} Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.438459 5072 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.441012 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" event={"ID":"c5ebae06-49d0-4c58-b08b-5e93fd627b09","Type":"ContainerStarted","Data":"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"} Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.441052 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" event={"ID":"c5ebae06-49d0-4c58-b08b-5e93fd627b09","Type":"ContainerStarted","Data":"f38f994e566b408766e6046b9ebd557dde9d3e894eaa61c61c04723164627834"} Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.441246 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.446241 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.448073 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.485466 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" podStartSLOduration=6.485444837 podStartE2EDuration="6.485444837s" podCreationTimestamp="2026-02-28 04:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:46.459243903 +0000 UTC m=+308.453974095" 
watchObservedRunningTime="2026-02-28 04:14:46.485444837 +0000 UTC m=+308.480175039" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.487195 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" podStartSLOduration=6.487186951 podStartE2EDuration="6.487186951s" podCreationTimestamp="2026-02-28 04:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:14:46.482617899 +0000 UTC m=+308.477348101" watchObservedRunningTime="2026-02-28 04:14:46.487186951 +0000 UTC m=+308.481917153" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.587365 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpbd5"] Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.587701 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cpbd5" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="registry-server" containerID="cri-o://3ed31b110f5ed5fb1341c895cbe0ca9cea3ef1e2896d1d99cc0e83b9f62ee130" gracePeriod=2 Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.787587 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"] Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.787964 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g7wqp" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="registry-server" containerID="cri-o://db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345" gracePeriod=2 Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 04:14:46.948342 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:14:46 crc kubenswrapper[5072]: I0228 
04:14:46.998621 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.295058 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.295453 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.448251 5072 generic.go:334] "Generic (PLEG): container finished" podID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerID="3ed31b110f5ed5fb1341c895cbe0ca9cea3ef1e2896d1d99cc0e83b9f62ee130" exitCode=0 Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.448435 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerDied","Data":"3ed31b110f5ed5fb1341c895cbe0ca9cea3ef1e2896d1d99cc0e83b9f62ee130"} Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.892900 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.954342 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8td77\" (UniqueName: \"kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77\") pod \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.954428 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities\") pod \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.955669 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities" (OuterVolumeSpecName: "utilities") pod "a0f264e9-eb98-4462-8a17-0dc4071f6b96" (UID: "a0f264e9-eb98-4462-8a17-0dc4071f6b96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.955898 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content\") pod \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\" (UID: \"a0f264e9-eb98-4462-8a17-0dc4071f6b96\") " Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.956552 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:47 crc kubenswrapper[5072]: I0228 04:14:47.972288 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77" (OuterVolumeSpecName: "kube-api-access-8td77") pod "a0f264e9-eb98-4462-8a17-0dc4071f6b96" (UID: "a0f264e9-eb98-4462-8a17-0dc4071f6b96"). InnerVolumeSpecName "kube-api-access-8td77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.035744 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f264e9-eb98-4462-8a17-0dc4071f6b96" (UID: "a0f264e9-eb98-4462-8a17-0dc4071f6b96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.057861 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8td77\" (UniqueName: \"kubernetes.io/projected/a0f264e9-eb98-4462-8a17-0dc4071f6b96-kube-api-access-8td77\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.057917 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f264e9-eb98-4462-8a17-0dc4071f6b96-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.077980 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.158826 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities\") pod \"1917348a-ad88-41e1-a1f1-215706769c5e\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.159281 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content\") pod \"1917348a-ad88-41e1-a1f1-215706769c5e\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.159414 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phxxs\" (UniqueName: \"kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs\") pod \"1917348a-ad88-41e1-a1f1-215706769c5e\" (UID: \"1917348a-ad88-41e1-a1f1-215706769c5e\") " Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.159663 5072 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities" (OuterVolumeSpecName: "utilities") pod "1917348a-ad88-41e1-a1f1-215706769c5e" (UID: "1917348a-ad88-41e1-a1f1-215706769c5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.159854 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.163029 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs" (OuterVolumeSpecName: "kube-api-access-phxxs") pod "1917348a-ad88-41e1-a1f1-215706769c5e" (UID: "1917348a-ad88-41e1-a1f1-215706769c5e"). InnerVolumeSpecName "kube-api-access-phxxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.189052 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1917348a-ad88-41e1-a1f1-215706769c5e" (UID: "1917348a-ad88-41e1-a1f1-215706769c5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.261709 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1917348a-ad88-41e1-a1f1-215706769c5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.261754 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phxxs\" (UniqueName: \"kubernetes.io/projected/1917348a-ad88-41e1-a1f1-215706769c5e-kube-api-access-phxxs\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.341203 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2k7bm" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="registry-server" probeResult="failure" output=< Feb 28 04:14:48 crc kubenswrapper[5072]: timeout: failed to connect service ":50051" within 1s Feb 28 04:14:48 crc kubenswrapper[5072]: > Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.455884 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpbd5" event={"ID":"a0f264e9-eb98-4462-8a17-0dc4071f6b96","Type":"ContainerDied","Data":"0250d0b5bb40464bf2f386c78c4a18882c23849a44adbe9a2116d217105f5f74"} Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.455949 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpbd5" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.455958 5072 scope.go:117] "RemoveContainer" containerID="3ed31b110f5ed5fb1341c895cbe0ca9cea3ef1e2896d1d99cc0e83b9f62ee130" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.458178 5072 generic.go:334] "Generic (PLEG): container finished" podID="1917348a-ad88-41e1-a1f1-215706769c5e" containerID="db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345" exitCode=0 Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.458360 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerDied","Data":"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345"} Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.458402 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7wqp" event={"ID":"1917348a-ad88-41e1-a1f1-215706769c5e","Type":"ContainerDied","Data":"4dbac56433ca8d5be7ed98724dcabf0fdb18b47ca3c8b480550ce9d1f6024534"} Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.458480 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7wqp" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.476032 5072 scope.go:117] "RemoveContainer" containerID="ba78c46299e964b6863eda4b9d00f6a71604ec61924214baf6f09d2d13c40503" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.488512 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpbd5"] Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.491734 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cpbd5"] Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.503044 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"] Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.508837 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7wqp"] Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.513307 5072 scope.go:117] "RemoveContainer" containerID="68e993abad76a3fedafcb441fe67b8278bfdcf5f9df81f866b069142ce99a6e8" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.528395 5072 scope.go:117] "RemoveContainer" containerID="db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.545568 5072 scope.go:117] "RemoveContainer" containerID="0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.576756 5072 scope.go:117] "RemoveContainer" containerID="be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.601806 5072 scope.go:117] "RemoveContainer" containerID="db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345" Feb 28 04:14:48 crc kubenswrapper[5072]: E0228 04:14:48.602349 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345\": container with ID starting with db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345 not found: ID does not exist" containerID="db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.602421 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345"} err="failed to get container status \"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345\": rpc error: code = NotFound desc = could not find container \"db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345\": container with ID starting with db9f69f151a25718b107d6a8e2897300614c13d220f75e16a42f9b96b7a58345 not found: ID does not exist" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.602477 5072 scope.go:117] "RemoveContainer" containerID="0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8" Feb 28 04:14:48 crc kubenswrapper[5072]: E0228 04:14:48.603112 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8\": container with ID starting with 0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8 not found: ID does not exist" containerID="0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.603160 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8"} err="failed to get container status \"0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8\": rpc error: code = NotFound desc = could not find container 
\"0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8\": container with ID starting with 0995935c903e462fe40e3b2368962a37e9f1287ae17171565981b36a32a9e7d8 not found: ID does not exist" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.603196 5072 scope.go:117] "RemoveContainer" containerID="be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5" Feb 28 04:14:48 crc kubenswrapper[5072]: E0228 04:14:48.603566 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5\": container with ID starting with be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5 not found: ID does not exist" containerID="be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.603607 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5"} err="failed to get container status \"be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5\": rpc error: code = NotFound desc = could not find container \"be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5\": container with ID starting with be1479e106d2039416b8455ff0dbf2a6a833a9bbb3d0541d0b63b4edf5bc1bb5 not found: ID does not exist" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.667277 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" path="/var/lib/kubelet/pods/1917348a-ad88-41e1-a1f1-215706769c5e/volumes" Feb 28 04:14:48 crc kubenswrapper[5072]: I0228 04:14:48.668130 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" path="/var/lib/kubelet/pods/a0f264e9-eb98-4462-8a17-0dc4071f6b96/volumes" Feb 28 04:14:55 crc kubenswrapper[5072]: I0228 04:14:55.004450 
5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4d9s6"] Feb 28 04:14:57 crc kubenswrapper[5072]: I0228 04:14:57.334622 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:57 crc kubenswrapper[5072]: I0228 04:14:57.385175 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:57 crc kubenswrapper[5072]: I0228 04:14:57.574214 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2k7bm"] Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.421228 5072 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.421913 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="extract-utilities" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.421931 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="extract-utilities" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.421947 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="extract-utilities" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.421953 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="extract-utilities" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.421983 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.421990 5072 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.422002 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422007 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.422016 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="extract-content" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422023 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="extract-content" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.422034 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="extract-content" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422061 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="extract-content" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422196 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f264e9-eb98-4462-8a17-0dc4071f6b96" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422212 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="1917348a-ad88-41e1-a1f1-215706769c5e" containerName="registry-server" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422580 5072 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422788 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422863 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50" gracePeriod=15 Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422928 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4" gracePeriod=15 Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.422998 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392" gracePeriod=15 Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.423124 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb" gracePeriod=15 Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.423130 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998" gracePeriod=15 Feb 28 04:14:58 crc 
kubenswrapper[5072]: I0228 04:14:58.425479 5072 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.425892 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.425912 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.425922 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.425932 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.425943 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.425952 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.425968 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.425977 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.425999 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426008 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.426021 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426030 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.426048 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426058 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.426070 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426079 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.426094 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426104 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426272 5072 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426292 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426308 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426320 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426329 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426345 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426356 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.426482 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426497 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426696 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.426719 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506362 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506479 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506535 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506574 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506608 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506627 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506690 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.506720 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.524982 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2k7bm" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="registry-server" containerID="cri-o://d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b" gracePeriod=2 Feb 28 04:14:58 crc kubenswrapper[5072]: E0228 04:14:58.526032 5072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-2k7bm.18984de4d1052b85 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-2k7bm,UID:cd61caed-be31-4706-9677-0da76d2cb2e7,APIVersion:v1,ResourceVersion:28875,FieldPath:spec.containers{registry-server},},Reason:Killing,Message:Stopping container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,LastTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.526320 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.526875 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.606403 5072 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" 
start-of-body= Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.606469 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608041 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608076 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608111 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608128 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 
04:14:58.608158 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608161 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608184 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608203 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608232 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608215 5072 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608247 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608232 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608246 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608312 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608353 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.608382 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.662561 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.662996 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.993846 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:58 crc kubenswrapper[5072]: I0228 04:14:58.995240 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.115433 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content\") pod \"cd61caed-be31-4706-9677-0da76d2cb2e7\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.115570 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqcmq\" (UniqueName: \"kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq\") pod \"cd61caed-be31-4706-9677-0da76d2cb2e7\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.115700 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities\") pod \"cd61caed-be31-4706-9677-0da76d2cb2e7\" (UID: \"cd61caed-be31-4706-9677-0da76d2cb2e7\") " Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.116697 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities" (OuterVolumeSpecName: "utilities") pod "cd61caed-be31-4706-9677-0da76d2cb2e7" (UID: "cd61caed-be31-4706-9677-0da76d2cb2e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.121573 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq" (OuterVolumeSpecName: "kube-api-access-pqcmq") pod "cd61caed-be31-4706-9677-0da76d2cb2e7" (UID: "cd61caed-be31-4706-9677-0da76d2cb2e7"). InnerVolumeSpecName "kube-api-access-pqcmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.217474 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqcmq\" (UniqueName: \"kubernetes.io/projected/cd61caed-be31-4706-9677-0da76d2cb2e7-kube-api-access-pqcmq\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.217518 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.266214 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd61caed-be31-4706-9677-0da76d2cb2e7" (UID: "cd61caed-be31-4706-9677-0da76d2cb2e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.319028 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd61caed-be31-4706-9677-0da76d2cb2e7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.536377 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.538521 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.539521 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4" exitCode=0 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.539553 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392" exitCode=0 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.539563 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb" exitCode=0 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.539571 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998" exitCode=2 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.539608 5072 scope.go:117] "RemoveContainer" containerID="16594bf1a0a90653662487f001b7e934e6e571576fca5b44bbd340ce05b964d9" Feb 28 
04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.542934 5072 generic.go:334] "Generic (PLEG): container finished" podID="e202e5af-1037-43ae-968e-0e594828048e" containerID="02231ede3d004e870f63d7e1d9de10eed2d61f06b210f1fbb725318b44d602b7" exitCode=0 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.543022 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e202e5af-1037-43ae-968e-0e594828048e","Type":"ContainerDied","Data":"02231ede3d004e870f63d7e1d9de10eed2d61f06b210f1fbb725318b44d602b7"} Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.544326 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.544773 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.546603 5072 generic.go:334] "Generic (PLEG): container finished" podID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerID="d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b" exitCode=0 Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.546635 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerDied","Data":"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b"} Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.546666 5072 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2k7bm" event={"ID":"cd61caed-be31-4706-9677-0da76d2cb2e7","Type":"ContainerDied","Data":"9aff94aad5af4de13bbf65e72a8ce74b1cde16f71835d008bee91127c94811fc"} Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.546714 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2k7bm" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.547454 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.547845 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.572310 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.573230 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:14:59 crc 
kubenswrapper[5072]: I0228 04:14:59.592123 5072 scope.go:117] "RemoveContainer" containerID="d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.611538 5072 scope.go:117] "RemoveContainer" containerID="0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.628220 5072 scope.go:117] "RemoveContainer" containerID="b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.646415 5072 scope.go:117] "RemoveContainer" containerID="d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b" Feb 28 04:14:59 crc kubenswrapper[5072]: E0228 04:14:59.647327 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b\": container with ID starting with d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b not found: ID does not exist" containerID="d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.647364 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b"} err="failed to get container status \"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b\": rpc error: code = NotFound desc = could not find container \"d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b\": container with ID starting with d3500e3f3c43259aab388a05ec5e92724790a92f6e20d4d807bcb99f0b407d0b not found: ID does not exist" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.647388 5072 scope.go:117] "RemoveContainer" containerID="0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372" Feb 28 04:14:59 crc kubenswrapper[5072]: E0228 04:14:59.647859 
5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372\": container with ID starting with 0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372 not found: ID does not exist" containerID="0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.647882 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372"} err="failed to get container status \"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372\": rpc error: code = NotFound desc = could not find container \"0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372\": container with ID starting with 0cb110bff899e1ca2268b4aee91ce3dece122434183403cb0281180cd782b372 not found: ID does not exist" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.647898 5072 scope.go:117] "RemoveContainer" containerID="b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36" Feb 28 04:14:59 crc kubenswrapper[5072]: E0228 04:14:59.648306 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36\": container with ID starting with b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36 not found: ID does not exist" containerID="b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36" Feb 28 04:14:59 crc kubenswrapper[5072]: I0228 04:14:59.648327 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36"} err="failed to get container status \"b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36\": rpc error: code = 
NotFound desc = could not find container \"b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36\": container with ID starting with b11dc963a97650c629e35b1fa0e4a2599cdccebe30a1e15b2472d2a32ee1dd36 not found: ID does not exist" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.567817 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.907743 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.908752 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.909683 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.910070 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.910708 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942368 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942468 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942507 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942533 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942541 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942606 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942786 5072 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942799 5072 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.942808 5072 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.978708 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.979287 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.979858 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:00 crc kubenswrapper[5072]: I0228 04:15:00.980509 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.043349 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock\") pod \"e202e5af-1037-43ae-968e-0e594828048e\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.043495 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access\") pod \"e202e5af-1037-43ae-968e-0e594828048e\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.043549 5072 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir\") pod \"e202e5af-1037-43ae-968e-0e594828048e\" (UID: \"e202e5af-1037-43ae-968e-0e594828048e\") " Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.043754 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock" (OuterVolumeSpecName: "var-lock") pod "e202e5af-1037-43ae-968e-0e594828048e" (UID: "e202e5af-1037-43ae-968e-0e594828048e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.043859 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e202e5af-1037-43ae-968e-0e594828048e" (UID: "e202e5af-1037-43ae-968e-0e594828048e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.044149 5072 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.044182 5072 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e202e5af-1037-43ae-968e-0e594828048e-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.050131 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e202e5af-1037-43ae-968e-0e594828048e" (UID: "e202e5af-1037-43ae-968e-0e594828048e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.145501 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e202e5af-1037-43ae-968e-0e594828048e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.579670 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.579678 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e202e5af-1037-43ae-968e-0e594828048e","Type":"ContainerDied","Data":"c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012"} Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.580568 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96b8891c9cf5455324d50ec7bd49628a9089e778476f33bbe987de99c153012" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.584951 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.586102 5072 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50" exitCode=0 Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.586208 5072 scope.go:117] "RemoveContainer" containerID="f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.586344 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.597617 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.597965 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.599865 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.614764 5072 scope.go:117] "RemoveContainer" containerID="4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.616613 5072 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.616912 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.617256 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.640511 5072 scope.go:117] "RemoveContainer" containerID="13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.659442 5072 scope.go:117] "RemoveContainer" containerID="64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.677116 5072 scope.go:117] "RemoveContainer" containerID="f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.699048 5072 scope.go:117] "RemoveContainer" containerID="aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.720946 5072 scope.go:117] "RemoveContainer" containerID="f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4" Feb 28 04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.721722 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\": container with ID starting with f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4 not found: ID does not exist" containerID="f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4" Feb 28 04:15:01 crc 
kubenswrapper[5072]: I0228 04:15:01.721775 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4"} err="failed to get container status \"f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\": rpc error: code = NotFound desc = could not find container \"f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4\": container with ID starting with f0fab2df3e50a8920880af6f5dd1dbf6c1c29bee9d75a1578ec6480bd6e681c4 not found: ID does not exist" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.721810 5072 scope.go:117] "RemoveContainer" containerID="4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392" Feb 28 04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.722248 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\": container with ID starting with 4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392 not found: ID does not exist" containerID="4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.722305 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392"} err="failed to get container status \"4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\": rpc error: code = NotFound desc = could not find container \"4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392\": container with ID starting with 4ccfcb64e5f9ea1a5a41d60419967ce9539d9dfc2f8d73373b72cf5037cc4392 not found: ID does not exist" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.722452 5072 scope.go:117] "RemoveContainer" containerID="13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb" Feb 28 
04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.722829 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\": container with ID starting with 13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb not found: ID does not exist" containerID="13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.722861 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb"} err="failed to get container status \"13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\": rpc error: code = NotFound desc = could not find container \"13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb\": container with ID starting with 13dd633b381ab7e531c3d57b781b973e355bccd91b02e49fd0509497c80fd4bb not found: ID does not exist" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.722880 5072 scope.go:117] "RemoveContainer" containerID="64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998" Feb 28 04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.723728 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\": container with ID starting with 64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998 not found: ID does not exist" containerID="64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.723869 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998"} err="failed to get container status 
\"64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\": rpc error: code = NotFound desc = could not find container \"64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998\": container with ID starting with 64693ead27b6c23100e0bb1e5ba2ca51aa44e91f4801c4597fcfd7d9ffb52998 not found: ID does not exist" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.724003 5072 scope.go:117] "RemoveContainer" containerID="f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50" Feb 28 04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.724660 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\": container with ID starting with f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50 not found: ID does not exist" containerID="f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.724688 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50"} err="failed to get container status \"f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\": rpc error: code = NotFound desc = could not find container \"f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50\": container with ID starting with f01e018bada3bd9d73cc44705b5efde7ff909611a0ad3184bf85c9e68b1efd50 not found: ID does not exist" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.724727 5072 scope.go:117] "RemoveContainer" containerID="aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4" Feb 28 04:15:01 crc kubenswrapper[5072]: E0228 04:15:01.726039 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\": container with ID starting with aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4 not found: ID does not exist" containerID="aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4" Feb 28 04:15:01 crc kubenswrapper[5072]: I0228 04:15:01.726090 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4"} err="failed to get container status \"aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\": rpc error: code = NotFound desc = could not find container \"aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4\": container with ID starting with aced7107309d40710244feb173ebe4ae2418b3278816ed8d14a2e4a1c7f895e4 not found: ID does not exist" Feb 28 04:15:02 crc kubenswrapper[5072]: E0228 04:15:02.472548 5072 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-2k7bm.18984de4d1052b85 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-2k7bm,UID:cd61caed-be31-4706-9677-0da76d2cb2e7,APIVersion:v1,ResourceVersion:28875,FieldPath:spec.containers{registry-server},},Reason:Killing,Message:Stopping container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,LastTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:15:02 crc kubenswrapper[5072]: I0228 04:15:02.671946 5072 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.461222 5072 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.5:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:03 crc kubenswrapper[5072]: I0228 04:15:03.462049 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.547084 5072 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.547252 5072 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.547979 5072 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.549091 5072 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.549981 5072 controller.go:195] "Failed to update 
lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:03 crc kubenswrapper[5072]: I0228 04:15:03.550051 5072 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.550432 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="200ms" Feb 28 04:15:03 crc kubenswrapper[5072]: I0228 04:15:03.604071 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"27dac4c776402271ac58cc56e522376baaec3f3b2e9928c71b591d39eb7b51c0"} Feb 28 04:15:03 crc kubenswrapper[5072]: E0228 04:15:03.751135 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="400ms" Feb 28 04:15:04 crc kubenswrapper[5072]: E0228 04:15:04.151848 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="800ms" Feb 28 04:15:04 crc kubenswrapper[5072]: I0228 04:15:04.609900 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4404f0ffe4cc391fd16c4e88a94dfd5129fe8f999654d62638fbaa02396b78fd"} Feb 28 04:15:04 crc kubenswrapper[5072]: I0228 04:15:04.610525 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:04 crc kubenswrapper[5072]: E0228 04:15:04.610661 5072 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.5:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:04 crc kubenswrapper[5072]: I0228 04:15:04.610849 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:04 crc kubenswrapper[5072]: E0228 04:15:04.952881 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="1.6s" Feb 28 04:15:05 crc kubenswrapper[5072]: E0228 04:15:05.616512 5072 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.5:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:06 crc kubenswrapper[5072]: E0228 04:15:06.554506 5072 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="3.2s" Feb 28 04:15:08 crc kubenswrapper[5072]: I0228 04:15:08.662864 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:08 crc kubenswrapper[5072]: I0228 04:15:08.663982 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:09 crc kubenswrapper[5072]: E0228 04:15:09.755877 5072 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="6.4s" Feb 28 04:15:10 crc kubenswrapper[5072]: E0228 04:15:10.663188 5072 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.5:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" volumeName="registry-storage" Feb 28 04:15:12 crc kubenswrapper[5072]: E0228 04:15:12.474273 5072 event.go:368] "Unable to write event 
(may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-2k7bm.18984de4d1052b85 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-2k7bm,UID:cd61caed-be31-4706-9677-0da76d2cb2e7,APIVersion:v1,ResourceVersion:28875,FieldPath:spec.containers{registry-server},},Reason:Killing,Message:Stopping container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,LastTimestamp:2026-02-28 04:14:58.524949381 +0000 UTC m=+320.519679613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.679686 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.680619 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.680885 5072 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005" exitCode=1 Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.680962 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005"} Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.681777 5072 scope.go:117] "RemoveContainer" containerID="c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.683143 5072 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.683574 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.685841 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:12 crc kubenswrapper[5072]: I0228 04:15:12.780935 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.658587 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.661317 5072 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.661756 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.662292 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.675843 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.675873 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:13 crc kubenswrapper[5072]: E0228 04:15:13.676292 5072 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.676857 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.698536 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.699305 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.699369 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c2c0a2ae35ad58e724f38ed0ce815f9d3d1ae4530efe71f93ac13ba9dc5230c0"} Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.700522 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.701156 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: I0228 04:15:13.701812 5072 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:13 crc kubenswrapper[5072]: W0228 04:15:13.716863 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-93b52eb69af6c76f3990728f0ba8d2c66b4f4558209853b1500b4105f6dc3654 WatchSource:0}: Error finding container 93b52eb69af6c76f3990728f0ba8d2c66b4f4558209853b1500b4105f6dc3654: Status 404 returned error can't find the container with id 93b52eb69af6c76f3990728f0ba8d2c66b4f4558209853b1500b4105f6dc3654 Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.706690 5072 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c5129ee8588435a56290186fecb4c3289c1cb5ecea2e8de3670f1ff0f084bfea" exitCode=0 Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.706753 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c5129ee8588435a56290186fecb4c3289c1cb5ecea2e8de3670f1ff0f084bfea"} Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.707117 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93b52eb69af6c76f3990728f0ba8d2c66b4f4558209853b1500b4105f6dc3654"} Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.709163 5072 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.709361 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.709634 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.710113 5072 status_manager.go:851] "Failed to get status for pod" podUID="e202e5af-1037-43ae-968e-0e594828048e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:14 crc kubenswrapper[5072]: E0228 04:15:14.710161 5072 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.5:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:14 crc kubenswrapper[5072]: I0228 04:15:14.710709 5072 status_manager.go:851] "Failed to get status for pod" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" pod="openshift-marketplace/redhat-operators-2k7bm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2k7bm\": dial tcp 38.102.83.5:6443: connect: connection refused" Feb 28 04:15:15 crc kubenswrapper[5072]: I0228 04:15:15.730150 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f94d566ca5d66053b4353d0631d18bd1215860714ba6b3a160310a3f4277d18"} Feb 28 04:15:15 crc kubenswrapper[5072]: I0228 04:15:15.730559 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"510a6bbb32ff38b111d4ed5bf6d4c152a9b3741312aa6d37fd5e46f9da99a89c"} Feb 28 04:15:15 crc kubenswrapper[5072]: I0228 04:15:15.730571 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9599ff057483a2fdeb53e3468dbf4569a503ff4f2e1993f59c163fcb3324c160"} Feb 28 04:15:15 crc kubenswrapper[5072]: I0228 04:15:15.730582 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98c86029701ce1c34365ee9d6b83f979ea8ecaa669c028c034b714d80cd1d4f1"} Feb 28 04:15:16 crc kubenswrapper[5072]: I0228 04:15:16.746641 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a95f74854b3b6b6bd14adc187fbe5a87965bc7a183732f1b0c44ce1231a81562"} Feb 28 04:15:16 crc kubenswrapper[5072]: I0228 04:15:16.746893 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:16 crc kubenswrapper[5072]: I0228 04:15:16.746992 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:16 crc kubenswrapper[5072]: I0228 04:15:16.747016 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:18 crc kubenswrapper[5072]: I0228 04:15:18.677233 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:18 crc kubenswrapper[5072]: I0228 04:15:18.677439 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:18 crc kubenswrapper[5072]: I0228 04:15:18.687229 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.088307 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift" containerID="cri-o://4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a" gracePeriod=15 Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.530434 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.561375 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.631806 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.631860 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.631886 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.631928 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.631977 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7fgw\" 
(UniqueName: \"kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632013 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632033 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632052 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632076 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632098 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632117 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632136 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632167 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.632188 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login\") pod \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\" (UID: \"497b9208-4958-46e8-8aeb-8bc2e0f172d6\") " Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.633051 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: 
"497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.633233 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.633504 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.633560 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.633604 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.639294 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.640164 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw" (OuterVolumeSpecName: "kube-api-access-p7fgw") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "kube-api-access-p7fgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.640277 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.649765 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.649858 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.651824 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.652372 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.653816 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.658057 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "497b9208-4958-46e8-8aeb-8bc2e0f172d6" (UID: "497b9208-4958-46e8-8aeb-8bc2e0f172d6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.733938 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734012 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7fgw\" (UniqueName: \"kubernetes.io/projected/497b9208-4958-46e8-8aeb-8bc2e0f172d6-kube-api-access-p7fgw\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734037 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734061 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734082 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734102 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734124 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734145 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734168 5072 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734186 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734208 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734227 5072 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734244 5072 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/497b9208-4958-46e8-8aeb-8bc2e0f172d6-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.734315 5072 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/497b9208-4958-46e8-8aeb-8bc2e0f172d6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.773323 5072 generic.go:334] "Generic (PLEG): container finished" podID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerID="4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a" exitCode=0 Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.773383 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" event={"ID":"497b9208-4958-46e8-8aeb-8bc2e0f172d6","Type":"ContainerDied","Data":"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a"} Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.773416 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" event={"ID":"497b9208-4958-46e8-8aeb-8bc2e0f172d6","Type":"ContainerDied","Data":"a65b19f5331f6726b3e322abd80c9079762ea319beea932246d37b929ef53c82"} Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.773418 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4d9s6" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.773477 5072 scope.go:117] "RemoveContainer" containerID="4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.788821 5072 scope.go:117] "RemoveContainer" containerID="4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a" Feb 28 04:15:20 crc kubenswrapper[5072]: E0228 04:15:20.789071 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a\": container with ID starting with 4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a not found: ID does not exist" containerID="4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a" Feb 28 04:15:20 crc kubenswrapper[5072]: I0228 04:15:20.789103 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a"} err="failed to get container status \"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a\": rpc error: code = NotFound desc = could not find container \"4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a\": container with ID starting with 4e30009cfd4d02a210889216b118c12167b9b3aa481cf292922af1510622cd8a not found: ID does not exist" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.405665 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.406206 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.406278 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.756636 5072 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.781018 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.781063 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.787089 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 04:15:21 crc kubenswrapper[5072]: E0228 04:15:21.881106 5072 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Feb 28 04:15:21 crc kubenswrapper[5072]: I0228 04:15:21.883683 5072 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ca4443ba-b04a-4ad2-bd3d-86b7df9f87c2" Feb 28 04:15:22 crc kubenswrapper[5072]: E0228 04:15:22.244219 5072 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Feb 28 04:15:22 crc kubenswrapper[5072]: I0228 04:15:22.786375 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:22 crc kubenswrapper[5072]: I0228 04:15:22.786418 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8" Feb 28 04:15:22 crc kubenswrapper[5072]: I0228 04:15:22.790772 5072 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ca4443ba-b04a-4ad2-bd3d-86b7df9f87c2" Feb 28 04:15:31 crc kubenswrapper[5072]: I0228 04:15:31.405965 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 28 04:15:31 crc kubenswrapper[5072]: I0228 04:15:31.406473 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 28 04:15:31 crc kubenswrapper[5072]: I0228 04:15:31.406294 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.504296 5072 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.573784 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.628782 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.903469 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.924445 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 28 04:15:32 crc kubenswrapper[5072]: I0228 04:15:32.975942 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.070544 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.167995 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.314668 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.394420 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.510693 5072 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.546195 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.819264 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 28 04:15:33 crc kubenswrapper[5072]: I0228 04:15:33.945156 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.200515 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.334144 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.405745 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.485393 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.497788 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.560126 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.659459 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 28 
04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.747240 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.790110 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.797000 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.824740 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.871317 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 28 04:15:34 crc kubenswrapper[5072]: I0228 04:15:34.913121 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.243801 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.247020 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.265356 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.265540 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.341333 5072 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.410753 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.423279 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.487057 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.490242 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.574968 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.633280 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.655008 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.733251 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.808860 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.817604 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 28 04:15:35 crc kubenswrapper[5072]: 
I0228 04:15:35.844244 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 28 04:15:35 crc kubenswrapper[5072]: I0228 04:15:35.942733 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.169483 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.331605 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.367744 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.424699 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.460165 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.546035 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.584564 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.687607 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.857892 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 28 04:15:36 
crc kubenswrapper[5072]: I0228 04:15:36.904614 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.941818 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 28 04:15:36 crc kubenswrapper[5072]: I0228 04:15:36.981161 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.017379 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.056105 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.077294 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.138155 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.276952 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.322460 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.333058 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.364109 5072 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.431088 5072 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.432155 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.483482 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.606727 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.958382 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 28 04:15:37 crc kubenswrapper[5072]: I0228 04:15:37.964231 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.056692 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.113306 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.163719 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.179563 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 
04:15:38.184009 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.269849 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.353335 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.529084 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.529524 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.577843 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.635075 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.648481 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.792230 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.857697 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 28 04:15:38 crc kubenswrapper[5072]: I0228 04:15:38.994725 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" 
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.070819 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.106809 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.170116 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.271426 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.370992 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.417263 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.423889 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.469137 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.499178 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.540229 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.553520 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.598780 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 28 04:15:39 crc kubenswrapper[5072]: I0228 04:15:39.663362 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.499778 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.508434 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.509148 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.512240 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.513489 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.513506 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.514111 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.514131 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.514154 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.514187 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.514301 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.517491 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.517955 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.527314 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.537027 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.537489 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.537719 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.542373 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.570927 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.577041 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.626989 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.631075 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.671515 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.699080 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.729042 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.826220 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.864347 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.944373 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.975141 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 28 04:15:40 crc kubenswrapper[5072]: I0228 04:15:40.995051 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.032763 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.050138 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.102994 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.125840 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.196250 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.361878 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.392515 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.405955 5072 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.406089 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.406194 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.407458 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c2c0a2ae35ad58e724f38ed0ce815f9d3d1ae4530efe71f93ac13ba9dc5230c0"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.407703 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c2c0a2ae35ad58e724f38ed0ce815f9d3d1ae4530efe71f93ac13ba9dc5230c0" gracePeriod=30
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.443122 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.596935 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.603139 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.652441 5072 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.675399 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.694951 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 28 04:15:41 crc kubenswrapper[5072]: I0228 04:15:41.850807 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.013873 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.094323 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.113790 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.203469 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.220226 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.329043 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.425033 5072 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.479610 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.501481 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.532269 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.533488 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.594412 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.594523 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.605327 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.627954 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.638265 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.710039 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.793169 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.821280 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.876492 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 28 04:15:42 crc kubenswrapper[5072]: I0228 04:15:42.902397 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.000906 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.011894 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.094087 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.114755 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.143485 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.148700 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.184319 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.190374 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.190470 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.336326 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.489679 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.560111 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.681080 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.687262 5072 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691261 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-4d9s6","openshift-marketplace/redhat-operators-2k7bm"]
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691336 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"]
Feb 28 04:15:43 crc kubenswrapper[5072]: E0228 04:15:43.691494 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e202e5af-1037-43ae-968e-0e594828048e" containerName="installer"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691514 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e202e5af-1037-43ae-968e-0e594828048e" containerName="installer"
Feb 28 04:15:43 crc kubenswrapper[5072]: E0228 04:15:43.691530 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="registry-server"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691535 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="registry-server"
Feb 28 04:15:43 crc kubenswrapper[5072]: E0228 04:15:43.691544 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="extract-content"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691550 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="extract-content"
Feb 28 04:15:43 crc kubenswrapper[5072]: E0228 04:15:43.691561 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="extract-utilities"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691566 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="extract-utilities"
Feb 28 04:15:43 crc kubenswrapper[5072]: E0228 04:15:43.691574 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691579 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691781 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" containerName="registry-server"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691808 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="e202e5af-1037-43ae-968e-0e594828048e" containerName="installer"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691818 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" containerName="oauth-openshift"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691821 5072 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.691839 5072 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef530fad-4e99-4682-a3b4-604c37c2b1a8"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.692278 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.694214 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.694765 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.695254 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.695314 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.695759 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.696583 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.696843 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.696932 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.697028 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.697834 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.697877 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.698045 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.698891 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.702838 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703410 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703457 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-session\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703479 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-dir\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703496 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-error\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703516 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703536 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703556 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703572 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703591 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.703619 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.704404 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck95b\" (UniqueName: \"kubernetes.io/projected/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-kube-api-access-ck95b\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.704439 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.704469 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-policies\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.704486 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.709828 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.716910 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.724617 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.724589219 podStartE2EDuration="22.724589219s" podCreationTimestamp="2026-02-28 04:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:15:43.717736786 +0000 UTC m=+365.712467008" watchObservedRunningTime="2026-02-28 04:15:43.724589219 +0000 UTC m=+365.719319411"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.759393 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805034 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805082 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805112 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck95b\" (UniqueName: \"kubernetes.io/projected/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-kube-api-access-ck95b\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805147 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805519 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-policies\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.805545 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806389 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-policies\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806458 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806486 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-session\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806782 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-dir\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806829 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-error\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806845 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806857 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.806905 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-audit-dir\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.807020 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.807117 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.807150 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.808578 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.809107 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"
Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.815183 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-error\") pod
\"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.815195 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.815302 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.816085 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.816518 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-session\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.816945 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.817166 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.817333 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.820091 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck95b\" (UniqueName: \"kubernetes.io/projected/16c977f0-5b1c-4d5a-9f5a-a5820f55e33f-kube-api-access-ck95b\") pod \"oauth-openshift-5c6d75d75f-tvzj6\" (UID: \"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f\") " pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.830509 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.923758 5072 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.956372 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.960256 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 28 04:15:43 crc kubenswrapper[5072]: I0228 04:15:43.975388 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.009260 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.139792 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.198233 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.215807 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.241010 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.324744 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.363728 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 
04:15:44.399348 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6"] Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.399656 5072 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.399898 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://4404f0ffe4cc391fd16c4e88a94dfd5129fe8f999654d62638fbaa02396b78fd" gracePeriod=5 Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.413577 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.419297 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.495729 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" event={"ID":"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f","Type":"ContainerStarted","Data":"90979263bf32d2ed2ef3a13c3892a1fff7c73389a128ea965d6d5dfd3eed44c3"} Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.662457 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.666032 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497b9208-4958-46e8-8aeb-8bc2e0f172d6" path="/var/lib/kubelet/pods/497b9208-4958-46e8-8aeb-8bc2e0f172d6/volumes" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.667147 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd61caed-be31-4706-9677-0da76d2cb2e7" path="/var/lib/kubelet/pods/cd61caed-be31-4706-9677-0da76d2cb2e7/volumes" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.794486 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.808768 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.905102 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 28 04:15:44 crc kubenswrapper[5072]: I0228 04:15:44.905254 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.011775 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.129563 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.197286 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.475868 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.693263 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.693989 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.695978 5072 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.702517 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" event={"ID":"16c977f0-5b1c-4d5a-9f5a-a5820f55e33f","Type":"ContainerStarted","Data":"5b6ddd1da5be25eb59c035ceb1b1efb983e9ceee0ed352e76a3c80d070bfca42"} Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.702745 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.704962 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.723073 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" podStartSLOduration=50.723050755 podStartE2EDuration="50.723050755s" podCreationTimestamp="2026-02-28 04:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:15:45.721951341 +0000 UTC m=+367.716681533" watchObservedRunningTime="2026-02-28 04:15:45.723050755 +0000 UTC m=+367.717780947" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.770755 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5c6d75d75f-tvzj6" Feb 28 04:15:45 crc kubenswrapper[5072]: I0228 04:15:45.989664 5072 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.031134 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 28 04:15:46 crc 
kubenswrapper[5072]: I0228 04:15:46.054203 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.139293 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.208066 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.217973 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.338816 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.462074 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.561376 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.828495 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.973171 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 28 04:15:46 crc kubenswrapper[5072]: I0228 04:15:46.981489 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 28 04:15:47 crc kubenswrapper[5072]: I0228 04:15:47.006635 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 28 04:15:47 
crc kubenswrapper[5072]: I0228 04:15:47.100249 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 28 04:15:47 crc kubenswrapper[5072]: I0228 04:15:47.224725 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 28 04:15:47 crc kubenswrapper[5072]: I0228 04:15:47.264809 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 28 04:15:47 crc kubenswrapper[5072]: I0228 04:15:47.757368 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.725864 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.725908 5072 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="4404f0ffe4cc391fd16c4e88a94dfd5129fe8f999654d62638fbaa02396b78fd" exitCode=137 Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.991970 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.992040 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994598 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994657 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994679 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994703 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994722 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994734 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994759 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994848 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.994909 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.995069 5072 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.995080 5072 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.995089 5072 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:49 crc kubenswrapper[5072]: I0228 04:15:49.995097 5072 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.002376 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.095633 5072 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.665368 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.732187 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.732261 5072 scope.go:117] "RemoveContainer" containerID="4404f0ffe4cc391fd16c4e88a94dfd5129fe8f999654d62638fbaa02396b78fd" Feb 28 04:15:50 crc kubenswrapper[5072]: I0228 04:15:50.732380 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 04:15:58 crc kubenswrapper[5072]: I0228 04:15:58.663403 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 28 04:16:00 crc kubenswrapper[5072]: I0228 04:16:00.179163 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 04:16:00 crc kubenswrapper[5072]: I0228 04:16:00.367930 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 04:16:02 crc kubenswrapper[5072]: I0228 04:16:02.582900 5072 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.901813 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.956103 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"] Feb 28 04:16:07 crc kubenswrapper[5072]: E0228 04:16:07.956481 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.956500 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.956738 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.957358 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.959750 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.959980 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537536-n4b67"] Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.960354 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.961085 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537536-n4b67" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.966035 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.966056 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.965934 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:16:07 crc kubenswrapper[5072]: I0228 04:16:07.972196 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"] Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.011807 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.015263 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537536-n4b67"] Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 
04:16:08.044827 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxlk\" (UniqueName: \"kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk\") pod \"auto-csr-approver-29537536-n4b67\" (UID: \"55524219-a6bb-4464-954e-2a8726c25a20\") " pod="openshift-infra/auto-csr-approver-29537536-n4b67"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.044923 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.044954 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhww\" (UniqueName: \"kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.044983 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.087319 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"]
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.087627 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" podUID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" containerName="controller-manager" containerID="cri-o://f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386" gracePeriod=30
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.134413 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"]
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.134883 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" podUID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" containerName="route-controller-manager" containerID="cri-o://8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d" gracePeriod=30
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.146704 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.146901 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxlk\" (UniqueName: \"kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk\") pod \"auto-csr-approver-29537536-n4b67\" (UID: \"55524219-a6bb-4464-954e-2a8726c25a20\") " pod="openshift-infra/auto-csr-approver-29537536-n4b67"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.147036 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.147128 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhww\" (UniqueName: \"kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.148235 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.159715 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.173954 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxlk\" (UniqueName: \"kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk\") pod \"auto-csr-approver-29537536-n4b67\" (UID: \"55524219-a6bb-4464-954e-2a8726c25a20\") " pod="openshift-infra/auto-csr-approver-29537536-n4b67"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.185491 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhww\" (UniqueName: \"kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww\") pod \"collect-profiles-29537535-6xknn\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.280684 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.288029 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537536-n4b67"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.719322 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.769390 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca\") pod \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.769511 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8c9q\" (UniqueName: \"kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q\") pod \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.769633 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config\") pod \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.769754 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert\") pod \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\" (UID: \"cbeb79c8-992a-461c-b02f-a58f5aaa31f4\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.770256 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "cbeb79c8-992a-461c-b02f-a58f5aaa31f4" (UID: "cbeb79c8-992a-461c-b02f-a58f5aaa31f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.770773 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config" (OuterVolumeSpecName: "config") pod "cbeb79c8-992a-461c-b02f-a58f5aaa31f4" (UID: "cbeb79c8-992a-461c-b02f-a58f5aaa31f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.773921 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q" (OuterVolumeSpecName: "kube-api-access-d8c9q") pod "cbeb79c8-992a-461c-b02f-a58f5aaa31f4" (UID: "cbeb79c8-992a-461c-b02f-a58f5aaa31f4"). InnerVolumeSpecName "kube-api-access-d8c9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.776074 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cbeb79c8-992a-461c-b02f-a58f5aaa31f4" (UID: "cbeb79c8-992a-461c-b02f-a58f5aaa31f4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.823465 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.842207 5072 generic.go:334] "Generic (PLEG): container finished" podID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" containerID="8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d" exitCode=0
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.842273 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" event={"ID":"cbeb79c8-992a-461c-b02f-a58f5aaa31f4","Type":"ContainerDied","Data":"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"}
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.842302 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr" event={"ID":"cbeb79c8-992a-461c-b02f-a58f5aaa31f4","Type":"ContainerDied","Data":"c2e375b7fbac84ebe982e68569020470d2726511ba26039c486ec95178158f9b"}
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.842320 5072 scope.go:117] "RemoveContainer" containerID="8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.842482 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.844699 5072 generic.go:334] "Generic (PLEG): container finished" podID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" containerID="f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386" exitCode=0
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.844740 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" event={"ID":"c5ebae06-49d0-4c58-b08b-5e93fd627b09","Type":"ContainerDied","Data":"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"}
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.844767 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx" event={"ID":"c5ebae06-49d0-4c58-b08b-5e93fd627b09","Type":"ContainerDied","Data":"f38f994e566b408766e6046b9ebd557dde9d3e894eaa61c61c04723164627834"}
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.844763 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78ff6764c-7x6gx"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.865383 5072 scope.go:117] "RemoveContainer" containerID="8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"
Feb 28 04:16:08 crc kubenswrapper[5072]: E0228 04:16:08.867838 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d\": container with ID starting with 8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d not found: ID does not exist" containerID="8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.867885 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d"} err="failed to get container status \"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d\": rpc error: code = NotFound desc = could not find container \"8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d\": container with ID starting with 8795131432c16c78ba7b7af07154ca023720eefdd3be0674400f9b82fc9aa93d not found: ID does not exist"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.867920 5072 scope.go:117] "RemoveContainer" containerID="f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.873954 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles\") pod \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874081 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config\") pod \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874131 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca\") pod \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874184 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert\") pod \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874233 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t8l9\" (UniqueName: \"kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9\") pod \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\" (UID: \"c5ebae06-49d0-4c58-b08b-5e93fd627b09\") "
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874512 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874538 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874554 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8c9q\" (UniqueName: \"kubernetes.io/projected/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-kube-api-access-d8c9q\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.874568 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbeb79c8-992a-461c-b02f-a58f5aaa31f4-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.875086 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c5ebae06-49d0-4c58-b08b-5e93fd627b09" (UID: "c5ebae06-49d0-4c58-b08b-5e93fd627b09"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.875408 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca" (OuterVolumeSpecName: "client-ca") pod "c5ebae06-49d0-4c58-b08b-5e93fd627b09" (UID: "c5ebae06-49d0-4c58-b08b-5e93fd627b09"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.875800 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config" (OuterVolumeSpecName: "config") pod "c5ebae06-49d0-4c58-b08b-5e93fd627b09" (UID: "c5ebae06-49d0-4c58-b08b-5e93fd627b09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.884288 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5ebae06-49d0-4c58-b08b-5e93fd627b09" (UID: "c5ebae06-49d0-4c58-b08b-5e93fd627b09"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.884323 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9" (OuterVolumeSpecName: "kube-api-access-9t8l9") pod "c5ebae06-49d0-4c58-b08b-5e93fd627b09" (UID: "c5ebae06-49d0-4c58-b08b-5e93fd627b09"). InnerVolumeSpecName "kube-api-access-9t8l9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.891724 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"]
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.892963 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d8d94fbf-wtrhr"]
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.896855 5072 scope.go:117] "RemoveContainer" containerID="f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"
Feb 28 04:16:08 crc kubenswrapper[5072]: E0228 04:16:08.900139 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386\": container with ID starting with f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386 not found: ID does not exist" containerID="f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.900191 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386"} err="failed to get container status \"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386\": rpc error: code = NotFound desc = could not find container \"f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386\": container with ID starting with f128a3d4bdf0dacb8608a946cc835c24285d4dce07e464f4db315b09d4133386 not found: ID does not exist"
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.969206 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn"]
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.980917 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.980955 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.980971 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ebae06-49d0-4c58-b08b-5e93fd627b09-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.980983 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ebae06-49d0-4c58-b08b-5e93fd627b09-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:08 crc kubenswrapper[5072]: I0228 04:16:08.980996 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t8l9\" (UniqueName: \"kubernetes.io/projected/c5ebae06-49d0-4c58-b08b-5e93fd627b09-kube-api-access-9t8l9\") on node \"crc\" DevicePath \"\""
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.011798 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537536-n4b67"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.176191 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.176665 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78ff6764c-7x6gx"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.463965 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.493189 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"]
Feb 28 04:16:09 crc kubenswrapper[5072]: E0228 04:16:09.493444 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" containerName="controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.493460 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" containerName="controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: E0228 04:16:09.493475 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" containerName="route-controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.493483 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" containerName="route-controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.493620 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" containerName="controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.493651 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" containerName="route-controller-manager"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.494069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499081 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499240 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499326 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499455 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499462 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.499783 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.503445 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.504014 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.504252 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.507772 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.507849 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.507870 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.507956 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.507964 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.510531 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.511723 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.514873 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"]
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588748 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmrm\" (UniqueName: \"kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588809 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588892 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588923 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588948 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.588969 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvfv\" (UniqueName: \"kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.589017 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.589046 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.589068 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690495 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmrm\" (UniqueName: \"kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690539 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690585 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690612 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690633 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690661 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvfv\" (UniqueName: \"kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690687 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690703 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.690720 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.691565 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.692469 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.692495 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.693092 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.693136 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.696664 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.697606 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.711062 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmrm\" (UniqueName: \"kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm\") pod \"controller-manager-76bbb855b9-29jlc\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.711188 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvfv\" (UniqueName: \"kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv\") pod \"route-controller-manager-5598468cdf-wmhst\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.811750 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc"
Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.824597 5072 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.858731 5072 generic.go:334] "Generic (PLEG): container finished" podID="e21f0772-4928-4b64-a6cf-640ecebe49d0" containerID="2b5291510582226c3e91c5408aca0513dd4fe0002aadb171c9eca72e0cbb0a48" exitCode=0 Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.859225 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" event={"ID":"e21f0772-4928-4b64-a6cf-640ecebe49d0","Type":"ContainerDied","Data":"2b5291510582226c3e91c5408aca0513dd4fe0002aadb171c9eca72e0cbb0a48"} Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.859271 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" event={"ID":"e21f0772-4928-4b64-a6cf-640ecebe49d0","Type":"ContainerStarted","Data":"e9348fb880acaa6b57b02f0cd6ef86bb8d3d5c732e6af1036553d644c97fe5f2"} Feb 28 04:16:09 crc kubenswrapper[5072]: I0228 04:16:09.864339 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537536-n4b67" event={"ID":"55524219-a6bb-4464-954e-2a8726c25a20","Type":"ContainerStarted","Data":"844721b479f72113d4af25c1c00743c51081507db6da1eb6d91ddef134415ee4"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.074062 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"] Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.130848 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"] Feb 28 04:16:10 crc kubenswrapper[5072]: W0228 04:16:10.134556 5072 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f60e1fb_dc64_426f_be47_dbb31f745383.slice/crio-326ff66c5a73adb5178b664f671542a4414f000ea00c2fafbc4f00416619a0d1 WatchSource:0}: Error finding container 326ff66c5a73adb5178b664f671542a4414f000ea00c2fafbc4f00416619a0d1: Status 404 returned error can't find the container with id 326ff66c5a73adb5178b664f671542a4414f000ea00c2fafbc4f00416619a0d1 Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.667138 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ebae06-49d0-4c58-b08b-5e93fd627b09" path="/var/lib/kubelet/pods/c5ebae06-49d0-4c58-b08b-5e93fd627b09/volumes" Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.668121 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbeb79c8-992a-461c-b02f-a58f5aaa31f4" path="/var/lib/kubelet/pods/cbeb79c8-992a-461c-b02f-a58f5aaa31f4/volumes" Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.873177 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" event={"ID":"3f60e1fb-dc64-426f-be47-dbb31f745383","Type":"ContainerStarted","Data":"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.873592 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" event={"ID":"3f60e1fb-dc64-426f-be47-dbb31f745383","Type":"ContainerStarted","Data":"326ff66c5a73adb5178b664f671542a4414f000ea00c2fafbc4f00416619a0d1"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.873620 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.875995 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537536-n4b67" 
event={"ID":"55524219-a6bb-4464-954e-2a8726c25a20","Type":"ContainerStarted","Data":"621788dbc76b372a609765bf1a78d22c609f24c553336f73fcd9ac806eaba91f"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.880656 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" event={"ID":"f79e8d0f-72bc-4bb6-a74b-63cad57fb129","Type":"ContainerStarted","Data":"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.880722 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" event={"ID":"f79e8d0f-72bc-4bb6-a74b-63cad57fb129","Type":"ContainerStarted","Data":"d34b529ef4640e65c2de024bf3bf595b770843a85d07494f9ca072a8e7ac4c6a"} Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.882368 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" Feb 28 04:16:10 crc kubenswrapper[5072]: I0228 04:16:10.930899 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" podStartSLOduration=2.9308692880000002 podStartE2EDuration="2.930869288s" podCreationTimestamp="2026-02-28 04:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:16:10.930152545 +0000 UTC m=+392.924882737" watchObservedRunningTime="2026-02-28 04:16:10.930869288 +0000 UTC m=+392.925599480" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.008180 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" podStartSLOduration=3.008164239 podStartE2EDuration="3.008164239s" podCreationTimestamp="2026-02-28 04:16:08 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:16:11.005048192 +0000 UTC m=+392.999778384" watchObservedRunningTime="2026-02-28 04:16:11.008164239 +0000 UTC m=+393.002894431" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.030480 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537536-n4b67" podStartSLOduration=3.010869095 podStartE2EDuration="4.030462261s" podCreationTimestamp="2026-02-28 04:16:07 +0000 UTC" firstStartedPulling="2026-02-28 04:16:09.037552087 +0000 UTC m=+391.032282279" lastFinishedPulling="2026-02-28 04:16:10.057145253 +0000 UTC m=+392.051875445" observedRunningTime="2026-02-28 04:16:11.026152667 +0000 UTC m=+393.020882859" watchObservedRunningTime="2026-02-28 04:16:11.030462261 +0000 UTC m=+393.025192453" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.225895 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.311760 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume\") pod \"e21f0772-4928-4b64-a6cf-640ecebe49d0\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.311958 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume\") pod \"e21f0772-4928-4b64-a6cf-640ecebe49d0\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.312033 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bhww\" (UniqueName: \"kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww\") pod \"e21f0772-4928-4b64-a6cf-640ecebe49d0\" (UID: \"e21f0772-4928-4b64-a6cf-640ecebe49d0\") " Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.312694 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "e21f0772-4928-4b64-a6cf-640ecebe49d0" (UID: "e21f0772-4928-4b64-a6cf-640ecebe49d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.319413 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e21f0772-4928-4b64-a6cf-640ecebe49d0" (UID: "e21f0772-4928-4b64-a6cf-640ecebe49d0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.336602 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww" (OuterVolumeSpecName: "kube-api-access-2bhww") pod "e21f0772-4928-4b64-a6cf-640ecebe49d0" (UID: "e21f0772-4928-4b64-a6cf-640ecebe49d0"). InnerVolumeSpecName "kube-api-access-2bhww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.413742 5072 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e21f0772-4928-4b64-a6cf-640ecebe49d0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.414088 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bhww\" (UniqueName: \"kubernetes.io/projected/e21f0772-4928-4b64-a6cf-640ecebe49d0-kube-api-access-2bhww\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.414100 5072 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e21f0772-4928-4b64-a6cf-640ecebe49d0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.888717 5072 generic.go:334] "Generic (PLEG): container finished" podID="55524219-a6bb-4464-954e-2a8726c25a20" containerID="621788dbc76b372a609765bf1a78d22c609f24c553336f73fcd9ac806eaba91f" exitCode=0 Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.888825 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537536-n4b67" event={"ID":"55524219-a6bb-4464-954e-2a8726c25a20","Type":"ContainerDied","Data":"621788dbc76b372a609765bf1a78d22c609f24c553336f73fcd9ac806eaba91f"} Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.890822 5072 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.892549 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.893191 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.893247 5072 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c2c0a2ae35ad58e724f38ed0ce815f9d3d1ae4530efe71f93ac13ba9dc5230c0" exitCode=137 Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.893295 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c2c0a2ae35ad58e724f38ed0ce815f9d3d1ae4530efe71f93ac13ba9dc5230c0"} Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.893372 5072 scope.go:117] "RemoveContainer" containerID="c229773678d8bee8898338eab595e8a47dba68ae424e54532ffc8eab7f077005" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.895804 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" event={"ID":"e21f0772-4928-4b64-a6cf-640ecebe49d0","Type":"ContainerDied","Data":"e9348fb880acaa6b57b02f0cd6ef86bb8d3d5c732e6af1036553d644c97fe5f2"} Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.895830 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9348fb880acaa6b57b02f0cd6ef86bb8d3d5c732e6af1036553d644c97fe5f2" Feb 
28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.895883 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537535-6xknn" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.896470 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" Feb 28 04:16:11 crc kubenswrapper[5072]: I0228 04:16:11.903663 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" Feb 28 04:16:12 crc kubenswrapper[5072]: I0228 04:16:12.905446 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 28 04:16:12 crc kubenswrapper[5072]: I0228 04:16:12.907170 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 04:16:12 crc kubenswrapper[5072]: I0228 04:16:12.907277 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc9a316bbbf5e5d17d4a5d00a7c6577e7351a34d385e72c1de28ce32ce34f1fe"} Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.237236 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537536-n4b67" Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.350394 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xxlk\" (UniqueName: \"kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk\") pod \"55524219-a6bb-4464-954e-2a8726c25a20\" (UID: \"55524219-a6bb-4464-954e-2a8726c25a20\") " Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.357021 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk" (OuterVolumeSpecName: "kube-api-access-5xxlk") pod "55524219-a6bb-4464-954e-2a8726c25a20" (UID: "55524219-a6bb-4464-954e-2a8726c25a20"). InnerVolumeSpecName "kube-api-access-5xxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.452816 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xxlk\" (UniqueName: \"kubernetes.io/projected/55524219-a6bb-4464-954e-2a8726c25a20-kube-api-access-5xxlk\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.469901 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.914246 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537536-n4b67" event={"ID":"55524219-a6bb-4464-954e-2a8726c25a20","Type":"ContainerDied","Data":"844721b479f72113d4af25c1c00743c51081507db6da1eb6d91ddef134415ee4"} Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.914700 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844721b479f72113d4af25c1c00743c51081507db6da1eb6d91ddef134415ee4" Feb 28 04:16:13 crc kubenswrapper[5072]: I0228 04:16:13.914303 5072 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537536-n4b67" Feb 28 04:16:14 crc kubenswrapper[5072]: I0228 04:16:14.469985 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 28 04:16:17 crc kubenswrapper[5072]: I0228 04:16:17.995287 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 28 04:16:19 crc kubenswrapper[5072]: I0228 04:16:19.081960 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 28 04:16:20 crc kubenswrapper[5072]: I0228 04:16:20.561121 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:16:20 crc kubenswrapper[5072]: I0228 04:16:20.877348 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 28 04:16:21 crc kubenswrapper[5072]: I0228 04:16:21.405984 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:16:21 crc kubenswrapper[5072]: I0228 04:16:21.412062 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:16:25 crc kubenswrapper[5072]: I0228 04:16:25.773258 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 28 04:16:27 crc kubenswrapper[5072]: I0228 04:16:27.346187 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 28 04:16:27 crc kubenswrapper[5072]: I0228 04:16:27.360373 5072 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console"/"default-dockercfg-chnjx" Feb 28 04:16:30 crc kubenswrapper[5072]: I0228 04:16:30.569787 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.047861 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"] Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.048431 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" podUID="3f60e1fb-dc64-426f-be47-dbb31f745383" containerName="controller-manager" containerID="cri-o://a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e" gracePeriod=30 Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.055507 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"] Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.055806 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" podUID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" containerName="route-controller-manager" containerID="cri-o://0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603" gracePeriod=30 Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.546898 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.677503 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.714900 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert\") pod \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.714982 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca\") pod \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.715014 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config\") pod \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.715043 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbvfv\" (UniqueName: \"kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv\") pod \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\" (UID: \"f79e8d0f-72bc-4bb6-a74b-63cad57fb129\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.716224 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca" (OuterVolumeSpecName: "client-ca") pod "f79e8d0f-72bc-4bb6-a74b-63cad57fb129" (UID: "f79e8d0f-72bc-4bb6-a74b-63cad57fb129"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.716606 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config" (OuterVolumeSpecName: "config") pod "f79e8d0f-72bc-4bb6-a74b-63cad57fb129" (UID: "f79e8d0f-72bc-4bb6-a74b-63cad57fb129"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.721516 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv" (OuterVolumeSpecName: "kube-api-access-dbvfv") pod "f79e8d0f-72bc-4bb6-a74b-63cad57fb129" (UID: "f79e8d0f-72bc-4bb6-a74b-63cad57fb129"). InnerVolumeSpecName "kube-api-access-dbvfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.721790 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f79e8d0f-72bc-4bb6-a74b-63cad57fb129" (UID: "f79e8d0f-72bc-4bb6-a74b-63cad57fb129"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.815837 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmrm\" (UniqueName: \"kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm\") pod \"3f60e1fb-dc64-426f-be47-dbb31f745383\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.815924 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config\") pod \"3f60e1fb-dc64-426f-be47-dbb31f745383\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.815996 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert\") pod \"3f60e1fb-dc64-426f-be47-dbb31f745383\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816018 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles\") pod \"3f60e1fb-dc64-426f-be47-dbb31f745383\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816052 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca\") pod \"3f60e1fb-dc64-426f-be47-dbb31f745383\" (UID: \"3f60e1fb-dc64-426f-be47-dbb31f745383\") " Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816299 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816311 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbvfv\" (UniqueName: \"kubernetes.io/projected/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-kube-api-access-dbvfv\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816321 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816329 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79e8d0f-72bc-4bb6-a74b-63cad57fb129-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816797 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f60e1fb-dc64-426f-be47-dbb31f745383" (UID: "3f60e1fb-dc64-426f-be47-dbb31f745383"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816808 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3f60e1fb-dc64-426f-be47-dbb31f745383" (UID: "3f60e1fb-dc64-426f-be47-dbb31f745383"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.816905 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config" (OuterVolumeSpecName: "config") pod "3f60e1fb-dc64-426f-be47-dbb31f745383" (UID: "3f60e1fb-dc64-426f-be47-dbb31f745383"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.819443 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f60e1fb-dc64-426f-be47-dbb31f745383" (UID: "3f60e1fb-dc64-426f-be47-dbb31f745383"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.819507 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm" (OuterVolumeSpecName: "kube-api-access-jqmrm") pod "3f60e1fb-dc64-426f-be47-dbb31f745383" (UID: "3f60e1fb-dc64-426f-be47-dbb31f745383"). InnerVolumeSpecName "kube-api-access-jqmrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.917256 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqmrm\" (UniqueName: \"kubernetes.io/projected/3f60e1fb-dc64-426f-be47-dbb31f745383-kube-api-access-jqmrm\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.917294 5072 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.917306 5072 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f60e1fb-dc64-426f-be47-dbb31f745383-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.917317 5072 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:32 crc kubenswrapper[5072]: I0228 04:16:32.917325 5072 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f60e1fb-dc64-426f-be47-dbb31f745383-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.033666 5072 generic.go:334] "Generic (PLEG): container finished" podID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" containerID="0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603" exitCode=0 Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.033732 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.033755 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" event={"ID":"f79e8d0f-72bc-4bb6-a74b-63cad57fb129","Type":"ContainerDied","Data":"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603"} Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.033807 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst" event={"ID":"f79e8d0f-72bc-4bb6-a74b-63cad57fb129","Type":"ContainerDied","Data":"d34b529ef4640e65c2de024bf3bf595b770843a85d07494f9ca072a8e7ac4c6a"} Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.033828 5072 scope.go:117] "RemoveContainer" containerID="0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.035503 5072 generic.go:334] "Generic (PLEG): container finished" podID="3f60e1fb-dc64-426f-be47-dbb31f745383" containerID="a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e" exitCode=0 Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.035575 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" event={"ID":"3f60e1fb-dc64-426f-be47-dbb31f745383","Type":"ContainerDied","Data":"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e"} Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.035605 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" event={"ID":"3f60e1fb-dc64-426f-be47-dbb31f745383","Type":"ContainerDied","Data":"326ff66c5a73adb5178b664f671542a4414f000ea00c2fafbc4f00416619a0d1"} Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.035729 5072 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-29jlc" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.048655 5072 scope.go:117] "RemoveContainer" containerID="0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603" Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.049360 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603\": container with ID starting with 0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603 not found: ID does not exist" containerID="0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.049411 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603"} err="failed to get container status \"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603\": rpc error: code = NotFound desc = could not find container \"0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603\": container with ID starting with 0ffa773f52150f9c2dff8d2f820f53f4b17a19cf76cf7313d8ddc158c01ea603 not found: ID does not exist" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.049435 5072 scope.go:117] "RemoveContainer" containerID="a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.059842 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"] Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.063735 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5598468cdf-wmhst"] Feb 28 04:16:33 crc kubenswrapper[5072]: 
I0228 04:16:33.068925 5072 scope.go:117] "RemoveContainer" containerID="a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.069577 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"] Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.070143 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e\": container with ID starting with a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e not found: ID does not exist" containerID="a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.070200 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e"} err="failed to get container status \"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e\": rpc error: code = NotFound desc = could not find container \"a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e\": container with ID starting with a1b195e928bc6f32c3ef65b41aa49c5a00484e652457582d6e5b84caf90bc71e not found: ID does not exist" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.073275 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-29jlc"] Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516392 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb"] Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.516690 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55524219-a6bb-4464-954e-2a8726c25a20" containerName="oc" Feb 28 04:16:33 crc kubenswrapper[5072]: 
I0228 04:16:33.516709 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="55524219-a6bb-4464-954e-2a8726c25a20" containerName="oc" Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.516723 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" containerName="route-controller-manager" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516731 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" containerName="route-controller-manager" Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.516745 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21f0772-4928-4b64-a6cf-640ecebe49d0" containerName="collect-profiles" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516753 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21f0772-4928-4b64-a6cf-640ecebe49d0" containerName="collect-profiles" Feb 28 04:16:33 crc kubenswrapper[5072]: E0228 04:16:33.516764 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f60e1fb-dc64-426f-be47-dbb31f745383" containerName="controller-manager" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516772 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f60e1fb-dc64-426f-be47-dbb31f745383" containerName="controller-manager" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516895 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" containerName="route-controller-manager" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516912 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="55524219-a6bb-4464-954e-2a8726c25a20" containerName="oc" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.516924 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f60e1fb-dc64-426f-be47-dbb31f745383" containerName="controller-manager" Feb 28 04:16:33 crc 
kubenswrapper[5072]: I0228 04:16:33.516938 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21f0772-4928-4b64-a6cf-640ecebe49d0" containerName="collect-profiles" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.517346 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.519043 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.519047 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.519736 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj"] Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.519848 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.520124 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.520155 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.520168 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.520337 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.522903 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.522992 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.523105 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.523182 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.523277 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.523398 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.530730 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.531586 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb"] Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.548331 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj"] Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.624999 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zb2pm\" (UniqueName: \"kubernetes.io/projected/59e44cd9-beba-43f0-966d-32147af5d418-kube-api-access-zb2pm\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625050 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbmk\" (UniqueName: \"kubernetes.io/projected/f226bc96-3265-4f70-8949-e0acba8ad2da-kube-api-access-ksbmk\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625074 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-client-ca\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625174 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-client-ca\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625228 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e44cd9-beba-43f0-966d-32147af5d418-serving-cert\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: 
\"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625255 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f226bc96-3265-4f70-8949-e0acba8ad2da-serving-cert\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625317 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-proxy-ca-bundles\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625442 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-config\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.625475 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-config\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726502 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zb2pm\" (UniqueName: \"kubernetes.io/projected/59e44cd9-beba-43f0-966d-32147af5d418-kube-api-access-zb2pm\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726548 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksbmk\" (UniqueName: \"kubernetes.io/projected/f226bc96-3265-4f70-8949-e0acba8ad2da-kube-api-access-ksbmk\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726570 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-client-ca\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726594 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-client-ca\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726613 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e44cd9-beba-43f0-966d-32147af5d418-serving-cert\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " 
pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726627 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f226bc96-3265-4f70-8949-e0acba8ad2da-serving-cert\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726664 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-proxy-ca-bundles\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726699 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-config\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.726713 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-config\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.728123 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-config\") pod 
\"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.728335 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-client-ca\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.728335 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-client-ca\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.728933 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/59e44cd9-beba-43f0-966d-32147af5d418-proxy-ca-bundles\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.729036 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f226bc96-3265-4f70-8949-e0acba8ad2da-config\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.730407 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/59e44cd9-beba-43f0-966d-32147af5d418-serving-cert\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.730870 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f226bc96-3265-4f70-8949-e0acba8ad2da-serving-cert\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.750493 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2pm\" (UniqueName: \"kubernetes.io/projected/59e44cd9-beba-43f0-966d-32147af5d418-kube-api-access-zb2pm\") pod \"controller-manager-85c5bddd8b-x7wlb\" (UID: \"59e44cd9-beba-43f0-966d-32147af5d418\") " pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.754582 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbmk\" (UniqueName: \"kubernetes.io/projected/f226bc96-3265-4f70-8949-e0acba8ad2da-kube-api-access-ksbmk\") pod \"route-controller-manager-7c5bcd9558-68fpj\" (UID: \"f226bc96-3265-4f70-8949-e0acba8ad2da\") " pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.843586 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:33 crc kubenswrapper[5072]: I0228 04:16:33.856303 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:34 crc kubenswrapper[5072]: I0228 04:16:34.242329 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb"] Feb 28 04:16:34 crc kubenswrapper[5072]: I0228 04:16:34.316916 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj"] Feb 28 04:16:34 crc kubenswrapper[5072]: W0228 04:16:34.319750 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf226bc96_3265_4f70_8949_e0acba8ad2da.slice/crio-1bcf521df495f3890db6b1733e97bccc2875ca1494c89498ae6cc3cd86f1fbe1 WatchSource:0}: Error finding container 1bcf521df495f3890db6b1733e97bccc2875ca1494c89498ae6cc3cd86f1fbe1: Status 404 returned error can't find the container with id 1bcf521df495f3890db6b1733e97bccc2875ca1494c89498ae6cc3cd86f1fbe1 Feb 28 04:16:34 crc kubenswrapper[5072]: I0228 04:16:34.666737 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f60e1fb-dc64-426f-be47-dbb31f745383" path="/var/lib/kubelet/pods/3f60e1fb-dc64-426f-be47-dbb31f745383/volumes" Feb 28 04:16:34 crc kubenswrapper[5072]: I0228 04:16:34.668112 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79e8d0f-72bc-4bb6-a74b-63cad57fb129" path="/var/lib/kubelet/pods/f79e8d0f-72bc-4bb6-a74b-63cad57fb129/volumes" Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.047706 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" event={"ID":"f226bc96-3265-4f70-8949-e0acba8ad2da","Type":"ContainerStarted","Data":"b361442f80490163a76ab240f4d7a1b1cba7b7b6a45dcc651fd4eba601ea6351"} Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.047741 5072 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" event={"ID":"f226bc96-3265-4f70-8949-e0acba8ad2da","Type":"ContainerStarted","Data":"1bcf521df495f3890db6b1733e97bccc2875ca1494c89498ae6cc3cd86f1fbe1"} Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.048628 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.049836 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" event={"ID":"59e44cd9-beba-43f0-966d-32147af5d418","Type":"ContainerStarted","Data":"dd432232af3ef6f83d9390b280f1ca54c70361ada7bd2437041740e0d1272bf2"} Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.050790 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" event={"ID":"59e44cd9-beba-43f0-966d-32147af5d418","Type":"ContainerStarted","Data":"243b6d8d84b1e35e01c1db2ceb9c94fdd90442e8077feadf030d057c3167d1c4"} Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.050816 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.055085 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.055484 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.065137 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" 
podStartSLOduration=3.065124746 podStartE2EDuration="3.065124746s" podCreationTimestamp="2026-02-28 04:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:16:35.063557757 +0000 UTC m=+417.058287949" watchObservedRunningTime="2026-02-28 04:16:35.065124746 +0000 UTC m=+417.059854938"
Feb 28 04:16:35 crc kubenswrapper[5072]: I0228 04:16:35.084159 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" podStartSLOduration=3.084139758 podStartE2EDuration="3.084139758s" podCreationTimestamp="2026-02-28 04:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:16:35.079663218 +0000 UTC m=+417.074393410" watchObservedRunningTime="2026-02-28 04:16:35.084139758 +0000 UTC m=+417.078869950"
Feb 28 04:16:50 crc kubenswrapper[5072]: I0228 04:16:50.106317 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:16:50 crc kubenswrapper[5072]: I0228 04:16:50.106854 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.451484 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ft9tz"]
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.453003 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.475111 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ft9tz"]
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610545 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c9f64dc-265f-449e-83ae-997e596a1d37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610602 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb897\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-kube-api-access-gb897\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610658 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-tls\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610684 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-bound-sa-token\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610737 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610764 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c9f64dc-265f-449e-83ae-997e596a1d37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610788 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-trusted-ca\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.610844 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-certificates\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.640327 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712570 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-certificates\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712671 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c9f64dc-265f-449e-83ae-997e596a1d37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712725 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb897\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-kube-api-access-gb897\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712774 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-tls\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712813 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-bound-sa-token\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712874 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c9f64dc-265f-449e-83ae-997e596a1d37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.712913 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-trusted-ca\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.716967 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c9f64dc-265f-449e-83ae-997e596a1d37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.717932 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-trusted-ca\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.718084 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-certificates\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.722285 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-registry-tls\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.722364 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c9f64dc-265f-449e-83ae-997e596a1d37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.731539 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-bound-sa-token\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.731804 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb897\" (UniqueName: \"kubernetes.io/projected/3c9f64dc-265f-449e-83ae-997e596a1d37-kube-api-access-gb897\") pod \"image-registry-66df7c8f76-ft9tz\" (UID: \"3c9f64dc-265f-449e-83ae-997e596a1d37\") " pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:56 crc kubenswrapper[5072]: I0228 04:16:56.780280 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:57 crc kubenswrapper[5072]: I0228 04:16:57.287715 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ft9tz"]
Feb 28 04:16:58 crc kubenswrapper[5072]: I0228 04:16:58.187371 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz" event={"ID":"3c9f64dc-265f-449e-83ae-997e596a1d37","Type":"ContainerStarted","Data":"1611734d0bb578a3d35d61795b405b1ddfe02b6303f8f76df2e22a8531fd689f"}
Feb 28 04:16:58 crc kubenswrapper[5072]: I0228 04:16:58.187545 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz" event={"ID":"3c9f64dc-265f-449e-83ae-997e596a1d37","Type":"ContainerStarted","Data":"30549c58e0e48af554ff28dfbc6c16d477c17e071b377f2af51bf3624247ab2a"}
Feb 28 04:16:58 crc kubenswrapper[5072]: I0228 04:16:58.187560 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:16:58 crc kubenswrapper[5072]: I0228 04:16:58.211374 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz" podStartSLOduration=2.211352861 podStartE2EDuration="2.211352861s" podCreationTimestamp="2026-02-28 04:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:16:58.208759701 +0000 UTC m=+440.203489903" watchObservedRunningTime="2026-02-28 04:16:58.211352861 +0000 UTC m=+440.206083053"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.177943 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.178882 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b4wcw" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="registry-server" containerID="cri-o://04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0" gracePeriod=30
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.183667 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7hzc"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.185537 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g7hzc" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="registry-server" containerID="cri-o://845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b" gracePeriod=30
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.192120 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.192502 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" containerID="cri-o://4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b" gracePeriod=30
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.203702 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.203941 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gw7ld" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="registry-server" containerID="cri-o://e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d" gracePeriod=30
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.215506 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.215853 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-67nm2" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="registry-server" containerID="cri-o://b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55" gracePeriod=30
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.221284 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jxjmq"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.222136 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.240553 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jxjmq"]
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.379407 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.379733 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpd8j\" (UniqueName: \"kubernetes.io/projected/cf5cb269-db5d-4b8d-ba70-a583c95dd586-kube-api-access-hpd8j\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.379752 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.481109 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.481181 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpd8j\" (UniqueName: \"kubernetes.io/projected/cf5cb269-db5d-4b8d-ba70-a583c95dd586-kube-api-access-hpd8j\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.481200 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.482429 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.488549 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cf5cb269-db5d-4b8d-ba70-a583c95dd586-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.501539 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpd8j\" (UniqueName: \"kubernetes.io/projected/cf5cb269-db5d-4b8d-ba70-a583c95dd586-kube-api-access-hpd8j\") pod \"marketplace-operator-79b997595-jxjmq\" (UID: \"cf5cb269-db5d-4b8d-ba70-a583c95dd586\") " pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.707412 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.718501 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.791104 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw7ld"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.792332 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stshz"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.810473 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67nm2"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.830583 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7hzc"
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891103 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjn5w\" (UniqueName: \"kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w\") pod \"56c867eb-6fd4-476a-8317-9f590f2ff47a\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891148 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkkj4\" (UniqueName: \"kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4\") pod \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891166 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c7n7\" (UniqueName: \"kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7\") pod \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891249 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content\") pod \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891269 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content\") pod \"96c8a41b-5700-46e9-bea3-aac12066069f\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891291 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities\") pod \"96c8a41b-5700-46e9-bea3-aac12066069f\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891316 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content\") pod \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891338 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqsz\" (UniqueName: \"kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz\") pod \"96c8a41b-5700-46e9-bea3-aac12066069f\" (UID: \"96c8a41b-5700-46e9-bea3-aac12066069f\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891357 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics\") pod \"56c867eb-6fd4-476a-8317-9f590f2ff47a\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891379 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities\") pod \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\" (UID: \"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891400 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca\") pod \"56c867eb-6fd4-476a-8317-9f590f2ff47a\" (UID: \"56c867eb-6fd4-476a-8317-9f590f2ff47a\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.891424 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities\") pod \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\" (UID: \"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.893751 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities" (OuterVolumeSpecName: "utilities") pod "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" (UID: "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.895167 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities" (OuterVolumeSpecName: "utilities") pod "96c8a41b-5700-46e9-bea3-aac12066069f" (UID: "96c8a41b-5700-46e9-bea3-aac12066069f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.898986 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4" (OuterVolumeSpecName: "kube-api-access-wkkj4") pod "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" (UID: "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23"). InnerVolumeSpecName "kube-api-access-wkkj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.899708 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w" (OuterVolumeSpecName: "kube-api-access-hjn5w") pod "56c867eb-6fd4-476a-8317-9f590f2ff47a" (UID: "56c867eb-6fd4-476a-8317-9f590f2ff47a"). InnerVolumeSpecName "kube-api-access-hjn5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.900400 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities" (OuterVolumeSpecName: "utilities") pod "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" (UID: "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.901032 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "56c867eb-6fd4-476a-8317-9f590f2ff47a" (UID: "56c867eb-6fd4-476a-8317-9f590f2ff47a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.912567 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz" (OuterVolumeSpecName: "kube-api-access-mjqsz") pod "96c8a41b-5700-46e9-bea3-aac12066069f" (UID: "96c8a41b-5700-46e9-bea3-aac12066069f"). InnerVolumeSpecName "kube-api-access-mjqsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.912751 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7" (OuterVolumeSpecName: "kube-api-access-6c7n7") pod "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" (UID: "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d"). InnerVolumeSpecName "kube-api-access-6c7n7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.913235 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "56c867eb-6fd4-476a-8317-9f590f2ff47a" (UID: "56c867eb-6fd4-476a-8317-9f590f2ff47a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.934332 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96c8a41b-5700-46e9-bea3-aac12066069f" (UID: "96c8a41b-5700-46e9-bea3-aac12066069f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.962106 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" (UID: "9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993065 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rhc6\" (UniqueName: \"kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6\") pod \"11731378-2c2a-448a-918f-2e2f07619ee0\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993107 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content\") pod \"11731378-2c2a-448a-918f-2e2f07619ee0\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993189 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities\") pod \"11731378-2c2a-448a-918f-2e2f07619ee0\" (UID: \"11731378-2c2a-448a-918f-2e2f07619ee0\") "
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993404 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkkj4\" (UniqueName: \"kubernetes.io/projected/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-kube-api-access-wkkj4\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993414 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c7n7\" (UniqueName: \"kubernetes.io/projected/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-kube-api-access-6c7n7\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993424 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993432 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993440 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c8a41b-5700-46e9-bea3-aac12066069f-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993450 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqsz\" (UniqueName: \"kubernetes.io/projected/96c8a41b-5700-46e9-bea3-aac12066069f-kube-api-access-mjqsz\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993458 5072 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993468 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993476 5072 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56c867eb-6fd4-476a-8317-9f590f2ff47a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993484 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.993495 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjn5w\" (UniqueName: \"kubernetes.io/projected/56c867eb-6fd4-476a-8317-9f590f2ff47a-kube-api-access-hjn5w\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:07 crc kubenswrapper[5072]: I0228 04:17:07.994750 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities" (OuterVolumeSpecName: "utilities") pod "11731378-2c2a-448a-918f-2e2f07619ee0" (UID: "11731378-2c2a-448a-918f-2e2f07619ee0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:07.998832 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6" (OuterVolumeSpecName: "kube-api-access-4rhc6") pod "11731378-2c2a-448a-918f-2e2f07619ee0" (UID: "11731378-2c2a-448a-918f-2e2f07619ee0"). InnerVolumeSpecName "kube-api-access-4rhc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.027857 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" (UID: "c68fedb7-ccc2-4f59-8b91-7a59776ccd1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.049359 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11731378-2c2a-448a-918f-2e2f07619ee0" (UID: "11731378-2c2a-448a-918f-2e2f07619ee0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.094926 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rhc6\" (UniqueName: \"kubernetes.io/projected/11731378-2c2a-448a-918f-2e2f07619ee0-kube-api-access-4rhc6\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.094958 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.094967 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11731378-2c2a-448a-918f-2e2f07619ee0-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.094976 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.140794 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jxjmq"]
Feb 28 04:17:08 crc kubenswrapper[5072]: W0228 04:17:08.144484 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5cb269_db5d_4b8d_ba70_a583c95dd586.slice/crio-3d3e41065013007bbfd818bf1dbdf565abceabcbf989069cff35b236afd2b1da WatchSource:0}: Error finding container 3d3e41065013007bbfd818bf1dbdf565abceabcbf989069cff35b236afd2b1da: Status 404 returned error can't find the container with id 3d3e41065013007bbfd818bf1dbdf565abceabcbf989069cff35b236afd2b1da
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.253556 5072 generic.go:334] "Generic (PLEG): container finished" podID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerID="04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0" exitCode=0
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.253688 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerDied","Data":"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0"}
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.253773 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4wcw"
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.253956 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4wcw" event={"ID":"9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23","Type":"ContainerDied","Data":"569779664f7de75250a9a132151d2fd297579f066c8d1d1c75f0ec9fe5c16c37"}
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.253985 5072 scope.go:117] "RemoveContainer" containerID="04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0"
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.257581 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq" event={"ID":"cf5cb269-db5d-4b8d-ba70-a583c95dd586","Type":"ContainerStarted","Data":"3d3e41065013007bbfd818bf1dbdf565abceabcbf989069cff35b236afd2b1da"}
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.258724 5072 generic.go:334] "Generic (PLEG): container finished" podID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerID="4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b" exitCode=0
Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.258780 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stshz"
event={"ID":"56c867eb-6fd4-476a-8317-9f590f2ff47a","Type":"ContainerDied","Data":"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.258788 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.258798 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stshz" event={"ID":"56c867eb-6fd4-476a-8317-9f590f2ff47a","Type":"ContainerDied","Data":"2894b982f790fb3fb94bce3351c0c32ffd296a0da8a2350101f917c8dde4e894"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.262530 5072 generic.go:334] "Generic (PLEG): container finished" podID="96c8a41b-5700-46e9-bea3-aac12066069f" containerID="e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d" exitCode=0 Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.262584 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerDied","Data":"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.262608 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw7ld" event={"ID":"96c8a41b-5700-46e9-bea3-aac12066069f","Type":"ContainerDied","Data":"1a76b49afbcbe3ed71b53055d861f9fa26981cc4656539b1209f10ad8d23dfaf"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.262674 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw7ld" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.264872 5072 generic.go:334] "Generic (PLEG): container finished" podID="11731378-2c2a-448a-918f-2e2f07619ee0" containerID="845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b" exitCode=0 Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.264921 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerDied","Data":"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.264939 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7hzc" event={"ID":"11731378-2c2a-448a-918f-2e2f07619ee0","Type":"ContainerDied","Data":"eaf73c18b34021ac2539b6035a061043bab001f53b0e5958a163786405345ef7"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.265000 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7hzc" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.269393 5072 generic.go:334] "Generic (PLEG): container finished" podID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerID="b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55" exitCode=0 Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.269453 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-67nm2" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.269458 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerDied","Data":"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.269574 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67nm2" event={"ID":"c68fedb7-ccc2-4f59-8b91-7a59776ccd1d","Type":"ContainerDied","Data":"6810b1b67ea37348b2f21fbd87e6607f601a001be27f480d2d1e6af6ead94daa"} Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.276269 5072 scope.go:117] "RemoveContainer" containerID="75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.296889 5072 scope.go:117] "RemoveContainer" containerID="35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.362116 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.365794 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw7ld"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.373913 5072 scope.go:117] "RemoveContainer" containerID="04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.377441 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0\": container with ID starting with 04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0 not found: ID does not exist" 
containerID="04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.377496 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0"} err="failed to get container status \"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0\": rpc error: code = NotFound desc = could not find container \"04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0\": container with ID starting with 04cbbd3883faa859c59f9f753cd5425652dad12d80aab0b6f8cf4f52a7c03ae0 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.377527 5072 scope.go:117] "RemoveContainer" containerID="75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.378019 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17\": container with ID starting with 75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17 not found: ID does not exist" containerID="75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.378042 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17"} err="failed to get container status \"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17\": rpc error: code = NotFound desc = could not find container \"75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17\": container with ID starting with 75966a5daebdf8990ad131bd823054ca3a7762c3972cc5513b54acd8c8b82a17 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.378059 5072 scope.go:117] 
"RemoveContainer" containerID="35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.378582 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1\": container with ID starting with 35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1 not found: ID does not exist" containerID="35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.378634 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1"} err="failed to get container status \"35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1\": rpc error: code = NotFound desc = could not find container \"35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1\": container with ID starting with 35570e0ac5cc1432a661cc44d86116308570a1f36f7f1d227b23ec043ac9e2f1 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.378688 5072 scope.go:117] "RemoveContainer" containerID="4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.390357 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.393494 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b4wcw"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.405494 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.409709 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-stshz"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.411144 5072 scope.go:117] "RemoveContainer" containerID="4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.412488 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b\": container with ID starting with 4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b not found: ID does not exist" containerID="4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.412543 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b"} err="failed to get container status \"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b\": rpc error: code = NotFound desc = could not find container \"4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b\": container with ID starting with 4883f9f62ce5afb1f90f8d181961be5632d65b871527a1c6081bea01aabff85b not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.412573 5072 scope.go:117] "RemoveContainer" containerID="e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.413198 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.417177 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-67nm2"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.430323 5072 scope.go:117] "RemoveContainer" 
containerID="0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.430472 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7hzc"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.433057 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g7hzc"] Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.447664 5072 scope.go:117] "RemoveContainer" containerID="a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.467288 5072 scope.go:117] "RemoveContainer" containerID="e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.467943 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d\": container with ID starting with e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d not found: ID does not exist" containerID="e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468013 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d"} err="failed to get container status \"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d\": rpc error: code = NotFound desc = could not find container \"e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d\": container with ID starting with e70e85cf32797a7de5c019358d0170dcd3592259184110912a9a8cc2c58c746d not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468052 5072 scope.go:117] "RemoveContainer" 
containerID="0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.468346 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f\": container with ID starting with 0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f not found: ID does not exist" containerID="0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468369 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f"} err="failed to get container status \"0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f\": rpc error: code = NotFound desc = could not find container \"0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f\": container with ID starting with 0b8f60188c82bbea7be85756ace9699b478c23ce20eddc988094f38ae4ea7e7f not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468384 5072 scope.go:117] "RemoveContainer" containerID="a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.468595 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638\": container with ID starting with a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638 not found: ID does not exist" containerID="a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468617 5072 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638"} err="failed to get container status \"a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638\": rpc error: code = NotFound desc = could not find container \"a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638\": container with ID starting with a2002edfbc29f58566a267fabfba768cacd51644a5dabddadd91f1a4d0d9b638 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.468634 5072 scope.go:117] "RemoveContainer" containerID="845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.494237 5072 scope.go:117] "RemoveContainer" containerID="ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.510856 5072 scope.go:117] "RemoveContainer" containerID="5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.530113 5072 scope.go:117] "RemoveContainer" containerID="845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.530549 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b\": container with ID starting with 845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b not found: ID does not exist" containerID="845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.530572 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b"} err="failed to get container status \"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b\": rpc error: code = 
NotFound desc = could not find container \"845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b\": container with ID starting with 845cde7cd0fbca2f3eca107d6005bf76ca9e5e0f4078734f3731f5b0c99a6f2b not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.530593 5072 scope.go:117] "RemoveContainer" containerID="ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.530928 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163\": container with ID starting with ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163 not found: ID does not exist" containerID="ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.530968 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163"} err="failed to get container status \"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163\": rpc error: code = NotFound desc = could not find container \"ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163\": container with ID starting with ce3eb6486bb03cfef7568bdc22e303afc43b8f21eec8b728356442639bec0163 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.530992 5072 scope.go:117] "RemoveContainer" containerID="5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.531236 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924\": container with ID starting with 
5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924 not found: ID does not exist" containerID="5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.531254 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924"} err="failed to get container status \"5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924\": rpc error: code = NotFound desc = could not find container \"5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924\": container with ID starting with 5b583e09fa2dbaa146d5a86d3fbc5f8d6cbbfdc14534151497eddbb164611924 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.531267 5072 scope.go:117] "RemoveContainer" containerID="b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.556237 5072 scope.go:117] "RemoveContainer" containerID="7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.573500 5072 scope.go:117] "RemoveContainer" containerID="b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.586026 5072 scope.go:117] "RemoveContainer" containerID="b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.586509 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55\": container with ID starting with b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55 not found: ID does not exist" containerID="b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 
04:17:08.586555 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55"} err="failed to get container status \"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55\": rpc error: code = NotFound desc = could not find container \"b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55\": container with ID starting with b61639a058f6c52f053e4a06c2159e9af6a403b10bfcf0b20f4d908e6ceeca55 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.586587 5072 scope.go:117] "RemoveContainer" containerID="7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751" Feb 28 04:17:08 crc kubenswrapper[5072]: E0228 04:17:08.586996 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751\": container with ID starting with 7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751 not found: ID does not exist" containerID="7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.587027 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751"} err="failed to get container status \"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751\": rpc error: code = NotFound desc = could not find container \"7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751\": container with ID starting with 7880cb2d594d95e1f17a0f55863b2d1f93416c42e9d181aabc4da55e10e37751 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.587046 5072 scope.go:117] "RemoveContainer" containerID="b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75" Feb 28 04:17:08 crc 
kubenswrapper[5072]: E0228 04:17:08.587294 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75\": container with ID starting with b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75 not found: ID does not exist" containerID="b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.587326 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75"} err="failed to get container status \"b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75\": rpc error: code = NotFound desc = could not find container \"b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75\": container with ID starting with b465d8e613e43666c53e177c825db98fc3c1e47bc9af171ee391277d6a589d75 not found: ID does not exist" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.666568 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" path="/var/lib/kubelet/pods/11731378-2c2a-448a-918f-2e2f07619ee0/volumes" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.667333 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" path="/var/lib/kubelet/pods/56c867eb-6fd4-476a-8317-9f590f2ff47a/volumes" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.667827 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" path="/var/lib/kubelet/pods/96c8a41b-5700-46e9-bea3-aac12066069f/volumes" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.668918 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" 
path="/var/lib/kubelet/pods/9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23/volumes" Feb 28 04:17:08 crc kubenswrapper[5072]: I0228 04:17:08.669470 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" path="/var/lib/kubelet/pods/c68fedb7-ccc2-4f59-8b91-7a59776ccd1d/volumes" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.197887 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xzs42"] Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198126 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198137 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198148 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198154 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198162 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198168 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198176 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198181 5072 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198191 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198197 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198206 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198211 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198222 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198227 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198234 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198240 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198248 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198254 5072 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="extract-content" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198261 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198267 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198275 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198281 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198288 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198294 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" Feb 28 04:17:09 crc kubenswrapper[5072]: E0228 04:17:09.198300 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198306 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="extract-utilities" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198394 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="11731378-2c2a-448a-918f-2e2f07619ee0" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198404 5072 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="96c8a41b-5700-46e9-bea3-aac12066069f" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198413 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c867eb-6fd4-476a-8317-9f590f2ff47a" containerName="marketplace-operator" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198424 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7a2a01-d70d-44c8-a3dd-cd6e583d8d23" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.198432 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68fedb7-ccc2-4f59-8b91-7a59776ccd1d" containerName="registry-server" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.199168 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.201353 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.206155 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xzs42"] Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.275272 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq" event={"ID":"cf5cb269-db5d-4b8d-ba70-a583c95dd586","Type":"ContainerStarted","Data":"8d253424f02b06a7031d7df203af487a681489d5623c556ae25b9593c540d322"} Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.275471 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.280169 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.302345 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jxjmq" podStartSLOduration=2.302298744 podStartE2EDuration="2.302298744s" podCreationTimestamp="2026-02-28 04:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:17:09.289565688 +0000 UTC m=+451.284295900" watchObservedRunningTime="2026-02-28 04:17:09.302298744 +0000 UTC m=+451.297028966" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.309435 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7jc\" (UniqueName: \"kubernetes.io/projected/16857e88-4eaa-40bb-86cb-04fd3da8babe-kube-api-access-6r7jc\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.309502 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-catalog-content\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.309593 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-utilities\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.410972 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6r7jc\" (UniqueName: \"kubernetes.io/projected/16857e88-4eaa-40bb-86cb-04fd3da8babe-kube-api-access-6r7jc\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.411031 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-catalog-content\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.411087 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-utilities\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.411554 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-utilities\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.411976 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16857e88-4eaa-40bb-86cb-04fd3da8babe-catalog-content\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.430717 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6r7jc\" (UniqueName: \"kubernetes.io/projected/16857e88-4eaa-40bb-86cb-04fd3da8babe-kube-api-access-6r7jc\") pod \"certified-operators-xzs42\" (UID: \"16857e88-4eaa-40bb-86cb-04fd3da8babe\") " pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.525865 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xzs42" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.798274 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vrkw"] Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.801166 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.803020 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.805295 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vrkw"] Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.898447 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xzs42"] Feb 28 04:17:09 crc kubenswrapper[5072]: W0228 04:17:09.908551 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16857e88_4eaa_40bb_86cb_04fd3da8babe.slice/crio-10c19774222fd31b3e3f72255b125568dd3b3f8279a6c18858ab3b06df4871bb WatchSource:0}: Error finding container 10c19774222fd31b3e3f72255b125568dd3b3f8279a6c18858ab3b06df4871bb: Status 404 returned error can't find the container with id 10c19774222fd31b3e3f72255b125568dd3b3f8279a6c18858ab3b06df4871bb Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.919511 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-catalog-content\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.919559 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-utilities\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:09 crc kubenswrapper[5072]: I0228 04:17:09.919598 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sj7m\" (UniqueName: \"kubernetes.io/projected/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-kube-api-access-8sj7m\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.020756 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sj7m\" (UniqueName: \"kubernetes.io/projected/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-kube-api-access-8sj7m\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.020823 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-catalog-content\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 
04:17:10.020855 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-utilities\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.021258 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-utilities\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.021324 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-catalog-content\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.038442 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sj7m\" (UniqueName: \"kubernetes.io/projected/df13e8d9-b1b8-4ed6-b16a-543fe5b71d46-kube-api-access-8sj7m\") pod \"redhat-marketplace-9vrkw\" (UID: \"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46\") " pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.121679 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vrkw" Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.293954 5072 generic.go:334] "Generic (PLEG): container finished" podID="16857e88-4eaa-40bb-86cb-04fd3da8babe" containerID="cdc32599f446c92bd0619c9f72fbd08265748d71dcaeba40cd9682abab15fb74" exitCode=0 Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.294128 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzs42" event={"ID":"16857e88-4eaa-40bb-86cb-04fd3da8babe","Type":"ContainerDied","Data":"cdc32599f446c92bd0619c9f72fbd08265748d71dcaeba40cd9682abab15fb74"} Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.295027 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzs42" event={"ID":"16857e88-4eaa-40bb-86cb-04fd3da8babe","Type":"ContainerStarted","Data":"10c19774222fd31b3e3f72255b125568dd3b3f8279a6c18858ab3b06df4871bb"} Feb 28 04:17:10 crc kubenswrapper[5072]: I0228 04:17:10.535171 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vrkw"] Feb 28 04:17:10 crc kubenswrapper[5072]: W0228 04:17:10.539814 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf13e8d9_b1b8_4ed6_b16a_543fe5b71d46.slice/crio-6313c6653494afe5d675503210a781d5ea7198055da45c325752191646674ce4 WatchSource:0}: Error finding container 6313c6653494afe5d675503210a781d5ea7198055da45c325752191646674ce4: Status 404 returned error can't find the container with id 6313c6653494afe5d675503210a781d5ea7198055da45c325752191646674ce4 Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.300935 5072 generic.go:334] "Generic (PLEG): container finished" podID="16857e88-4eaa-40bb-86cb-04fd3da8babe" containerID="668b7da9bd6674a2b52d8c3cecc1846be81e7caa00e307a65444b3c326e98fbc" exitCode=0 Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 
04:17:11.300990 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzs42" event={"ID":"16857e88-4eaa-40bb-86cb-04fd3da8babe","Type":"ContainerDied","Data":"668b7da9bd6674a2b52d8c3cecc1846be81e7caa00e307a65444b3c326e98fbc"} Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.305794 5072 generic.go:334] "Generic (PLEG): container finished" podID="df13e8d9-b1b8-4ed6-b16a-543fe5b71d46" containerID="d96c9bdbaad617028473d288e1736947706a6517426f7ecda028a8faafd6b60b" exitCode=0 Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.305877 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vrkw" event={"ID":"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46","Type":"ContainerDied","Data":"d96c9bdbaad617028473d288e1736947706a6517426f7ecda028a8faafd6b60b"} Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.305910 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vrkw" event={"ID":"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46","Type":"ContainerStarted","Data":"6313c6653494afe5d675503210a781d5ea7198055da45c325752191646674ce4"} Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.594219 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64jqj"] Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.596108 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.598862 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.604508 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64jqj"] Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.741118 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-catalog-content\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.741281 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfnw\" (UniqueName: \"kubernetes.io/projected/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-kube-api-access-5rfnw\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.741321 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-utilities\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.842063 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-utilities\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " 
pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.842113 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-catalog-content\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.842176 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfnw\" (UniqueName: \"kubernetes.io/projected/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-kube-api-access-5rfnw\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.842675 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-utilities\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.842745 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-catalog-content\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.868886 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfnw\" (UniqueName: \"kubernetes.io/projected/4fa349e7-fe1e-47f4-80bd-7d0e1bf55719-kube-api-access-5rfnw\") pod \"redhat-operators-64jqj\" (UID: \"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719\") " pod="openshift-marketplace/redhat-operators-64jqj" Feb 
28 04:17:11 crc kubenswrapper[5072]: I0228 04:17:11.926149 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64jqj" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.192306 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qf7q"] Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.193781 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.195602 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.205903 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qf7q"] Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.313308 5072 generic.go:334] "Generic (PLEG): container finished" podID="df13e8d9-b1b8-4ed6-b16a-543fe5b71d46" containerID="06649faf8c5d3cdb7ee4c6c71440140624ea47fba5a59c0654b00eefb86cfa21" exitCode=0 Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.313393 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vrkw" event={"ID":"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46","Type":"ContainerDied","Data":"06649faf8c5d3cdb7ee4c6c71440140624ea47fba5a59c0654b00eefb86cfa21"} Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.315130 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xzs42" event={"ID":"16857e88-4eaa-40bb-86cb-04fd3da8babe","Type":"ContainerStarted","Data":"45508af649f1ce87dda1f7aed054d2a0ee71d52803f64832af3f54acd3ad29af"} Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.317976 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64jqj"] Feb 28 04:17:12 
crc kubenswrapper[5072]: I0228 04:17:12.356432 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-utilities\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.356708 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-catalog-content\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.356807 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmrc\" (UniqueName: \"kubernetes.io/projected/6dbe0794-9375-4056-98bf-7ae9f9f10093-kube-api-access-xhmrc\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.357338 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xzs42" podStartSLOduration=1.971343236 podStartE2EDuration="3.357317401s" podCreationTimestamp="2026-02-28 04:17:09 +0000 UTC" firstStartedPulling="2026-02-28 04:17:10.296547814 +0000 UTC m=+452.291278006" lastFinishedPulling="2026-02-28 04:17:11.682521989 +0000 UTC m=+453.677252171" observedRunningTime="2026-02-28 04:17:12.356206266 +0000 UTC m=+454.350936468" watchObservedRunningTime="2026-02-28 04:17:12.357317401 +0000 UTC m=+454.352047593" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.458256 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-utilities\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.458381 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-catalog-content\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.458454 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhmrc\" (UniqueName: \"kubernetes.io/projected/6dbe0794-9375-4056-98bf-7ae9f9f10093-kube-api-access-xhmrc\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.458803 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-utilities\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.460403 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbe0794-9375-4056-98bf-7ae9f9f10093-catalog-content\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.482490 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhmrc\" (UniqueName: 
\"kubernetes.io/projected/6dbe0794-9375-4056-98bf-7ae9f9f10093-kube-api-access-xhmrc\") pod \"community-operators-6qf7q\" (UID: \"6dbe0794-9375-4056-98bf-7ae9f9f10093\") " pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.507453 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qf7q" Feb 28 04:17:12 crc kubenswrapper[5072]: I0228 04:17:12.888635 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qf7q"] Feb 28 04:17:12 crc kubenswrapper[5072]: W0228 04:17:12.892946 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbe0794_9375_4056_98bf_7ae9f9f10093.slice/crio-ee7fabd7b6a8058f3ffd908a878e6c7d8edac7c4c6569b8bb46842c21287185c WatchSource:0}: Error finding container ee7fabd7b6a8058f3ffd908a878e6c7d8edac7c4c6569b8bb46842c21287185c: Status 404 returned error can't find the container with id ee7fabd7b6a8058f3ffd908a878e6c7d8edac7c4c6569b8bb46842c21287185c Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.322518 5072 generic.go:334] "Generic (PLEG): container finished" podID="6dbe0794-9375-4056-98bf-7ae9f9f10093" containerID="b93a2a5b1053cbba113edf5a9082e6a5c7c231b4cd2e2921dab01b735af8433b" exitCode=0 Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.322616 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qf7q" event={"ID":"6dbe0794-9375-4056-98bf-7ae9f9f10093","Type":"ContainerDied","Data":"b93a2a5b1053cbba113edf5a9082e6a5c7c231b4cd2e2921dab01b735af8433b"} Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.322663 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qf7q" 
event={"ID":"6dbe0794-9375-4056-98bf-7ae9f9f10093","Type":"ContainerStarted","Data":"ee7fabd7b6a8058f3ffd908a878e6c7d8edac7c4c6569b8bb46842c21287185c"} Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.325036 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vrkw" event={"ID":"df13e8d9-b1b8-4ed6-b16a-543fe5b71d46","Type":"ContainerStarted","Data":"3f512e8f326426692d8328b095280ac67bd74508756e56bbab76ec265322181d"} Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.326383 5072 generic.go:334] "Generic (PLEG): container finished" podID="4fa349e7-fe1e-47f4-80bd-7d0e1bf55719" containerID="483864d6b9b640d67082990b92dbf47eba7d584c064e4c2321e67c77dd67b823" exitCode=0 Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.326465 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64jqj" event={"ID":"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719","Type":"ContainerDied","Data":"483864d6b9b640d67082990b92dbf47eba7d584c064e4c2321e67c77dd67b823"} Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.326494 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64jqj" event={"ID":"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719","Type":"ContainerStarted","Data":"6bc9d50a6a611a60c6e52d99bbf2be3bea8023390a2229db3e66ef4ab640c7a9"} Feb 28 04:17:13 crc kubenswrapper[5072]: I0228 04:17:13.358528 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vrkw" podStartSLOduration=2.972370395 podStartE2EDuration="4.358507746s" podCreationTimestamp="2026-02-28 04:17:09 +0000 UTC" firstStartedPulling="2026-02-28 04:17:11.306995837 +0000 UTC m=+453.301726029" lastFinishedPulling="2026-02-28 04:17:12.693133188 +0000 UTC m=+454.687863380" observedRunningTime="2026-02-28 04:17:13.355429551 +0000 UTC m=+455.350159753" watchObservedRunningTime="2026-02-28 04:17:13.358507746 +0000 UTC m=+455.353237938" 
Feb 28 04:17:14 crc kubenswrapper[5072]: I0228 04:17:14.335198 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64jqj" event={"ID":"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719","Type":"ContainerStarted","Data":"eac583159287bcaad73fd1b765c6151909aaf2d9f35ab85d60d0202d999c546f"}
Feb 28 04:17:14 crc kubenswrapper[5072]: I0228 04:17:14.337170 5072 generic.go:334] "Generic (PLEG): container finished" podID="6dbe0794-9375-4056-98bf-7ae9f9f10093" containerID="6d0de723f56c1f527e0ed13b1e331e882f43b6aeeadc3fa575f8487a79280455" exitCode=0
Feb 28 04:17:14 crc kubenswrapper[5072]: I0228 04:17:14.338125 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qf7q" event={"ID":"6dbe0794-9375-4056-98bf-7ae9f9f10093","Type":"ContainerDied","Data":"6d0de723f56c1f527e0ed13b1e331e882f43b6aeeadc3fa575f8487a79280455"}
Feb 28 04:17:15 crc kubenswrapper[5072]: I0228 04:17:15.345450 5072 generic.go:334] "Generic (PLEG): container finished" podID="4fa349e7-fe1e-47f4-80bd-7d0e1bf55719" containerID="eac583159287bcaad73fd1b765c6151909aaf2d9f35ab85d60d0202d999c546f" exitCode=0
Feb 28 04:17:15 crc kubenswrapper[5072]: I0228 04:17:15.345523 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64jqj" event={"ID":"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719","Type":"ContainerDied","Data":"eac583159287bcaad73fd1b765c6151909aaf2d9f35ab85d60d0202d999c546f"}
Feb 28 04:17:15 crc kubenswrapper[5072]: I0228 04:17:15.348869 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qf7q" event={"ID":"6dbe0794-9375-4056-98bf-7ae9f9f10093","Type":"ContainerStarted","Data":"780ac01c32bebc9c633364f141ae465c454d7c6ac668733d6e1664027728b900"}
Feb 28 04:17:15 crc kubenswrapper[5072]: I0228 04:17:15.380240 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qf7q" podStartSLOduration=1.778483362 podStartE2EDuration="3.380221449s" podCreationTimestamp="2026-02-28 04:17:12 +0000 UTC" firstStartedPulling="2026-02-28 04:17:13.323839178 +0000 UTC m=+455.318569370" lastFinishedPulling="2026-02-28 04:17:14.925577265 +0000 UTC m=+456.920307457" observedRunningTime="2026-02-28 04:17:15.378416983 +0000 UTC m=+457.373147205" watchObservedRunningTime="2026-02-28 04:17:15.380221449 +0000 UTC m=+457.374951651"
Feb 28 04:17:16 crc kubenswrapper[5072]: I0228 04:17:16.355404 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64jqj" event={"ID":"4fa349e7-fe1e-47f4-80bd-7d0e1bf55719","Type":"ContainerStarted","Data":"c76ce4f81b574d4fa81a2158594b21fea9bc98ceb2b99c864e9a8255d4695157"}
Feb 28 04:17:16 crc kubenswrapper[5072]: I0228 04:17:16.372045 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-64jqj" podStartSLOduration=2.9908477380000003 podStartE2EDuration="5.372029253s" podCreationTimestamp="2026-02-28 04:17:11 +0000 UTC" firstStartedPulling="2026-02-28 04:17:13.327494992 +0000 UTC m=+455.322225184" lastFinishedPulling="2026-02-28 04:17:15.708676507 +0000 UTC m=+457.703406699" observedRunningTime="2026-02-28 04:17:16.370285108 +0000 UTC m=+458.365015300" watchObservedRunningTime="2026-02-28 04:17:16.372029253 +0000 UTC m=+458.366759435"
Feb 28 04:17:16 crc kubenswrapper[5072]: I0228 04:17:16.786193 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ft9tz"
Feb 28 04:17:16 crc kubenswrapper[5072]: I0228 04:17:16.841509 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"]
Feb 28 04:17:19 crc kubenswrapper[5072]: I0228 04:17:19.526292 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xzs42"
Feb 28 04:17:19 crc kubenswrapper[5072]: I0228 04:17:19.526775 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xzs42"
Feb 28 04:17:19 crc kubenswrapper[5072]: I0228 04:17:19.568632 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xzs42"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.105749 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.105806 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.122686 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vrkw"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.123001 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vrkw"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.161054 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vrkw"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.418414 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xzs42"
Feb 28 04:17:20 crc kubenswrapper[5072]: I0228 04:17:20.419991 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vrkw"
Feb 28 04:17:21 crc kubenswrapper[5072]: I0228 04:17:21.926709 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64jqj"
Feb 28 04:17:21 crc kubenswrapper[5072]: I0228 04:17:21.927109 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-64jqj"
Feb 28 04:17:21 crc kubenswrapper[5072]: I0228 04:17:21.976813 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64jqj"
Feb 28 04:17:22 crc kubenswrapper[5072]: I0228 04:17:22.418104 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64jqj"
Feb 28 04:17:22 crc kubenswrapper[5072]: I0228 04:17:22.507669 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qf7q"
Feb 28 04:17:22 crc kubenswrapper[5072]: I0228 04:17:22.507720 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qf7q"
Feb 28 04:17:22 crc kubenswrapper[5072]: I0228 04:17:22.540592 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qf7q"
Feb 28 04:17:23 crc kubenswrapper[5072]: I0228 04:17:23.436261 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qf7q"
Feb 28 04:17:41 crc kubenswrapper[5072]: I0228 04:17:41.903340 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" podUID="3b94a919-0f97-48a8-aac9-4f52655d572d" containerName="registry" containerID="cri-o://0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1" gracePeriod=30
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.334550 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.420271 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4k4q\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.420713 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.420918 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.420959 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.421032 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.421061 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.421119 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.421154 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token\") pod \"3b94a919-0f97-48a8-aac9-4f52655d572d\" (UID: \"3b94a919-0f97-48a8-aac9-4f52655d572d\") "
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.422101 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.422255 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.428708 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.429281 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.429684 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.434108 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.439352 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.440522 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q" (OuterVolumeSpecName: "kube-api-access-w4k4q") pod "3b94a919-0f97-48a8-aac9-4f52655d572d" (UID: "3b94a919-0f97-48a8-aac9-4f52655d572d"). InnerVolumeSpecName "kube-api-access-w4k4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.506325 5072 generic.go:334] "Generic (PLEG): container finished" podID="3b94a919-0f97-48a8-aac9-4f52655d572d" containerID="0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1" exitCode=0
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.506379 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" event={"ID":"3b94a919-0f97-48a8-aac9-4f52655d572d","Type":"ContainerDied","Data":"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"}
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.506410 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j" event={"ID":"3b94a919-0f97-48a8-aac9-4f52655d572d","Type":"ContainerDied","Data":"3da31d7e10c6625b7b409ddef629c06429c6527356b29682cdf6571a6986dc5a"}
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.506429 5072 scope.go:117] "RemoveContainer" containerID="0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.506442 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bw85j"
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526171 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4k4q\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-kube-api-access-w4k4q\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526214 5072 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526227 5072 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b94a919-0f97-48a8-aac9-4f52655d572d-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526237 5072 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526247 5072 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b94a919-0f97-48a8-aac9-4f52655d572d-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526255 5072 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b94a919-0f97-48a8-aac9-4f52655d572d-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.526264 5072 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b94a919-0f97-48a8-aac9-4f52655d572d-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.540244 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"]
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.545956 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bw85j"]
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.546288 5072 scope.go:117] "RemoveContainer" containerID="0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"
Feb 28 04:17:42 crc kubenswrapper[5072]: E0228 04:17:42.548119 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1\": container with ID starting with 0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1 not found: ID does not exist" containerID="0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.548178 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1"} err="failed to get container status \"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1\": rpc error: code = NotFound desc = could not find container \"0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1\": container with ID starting with 0b250a39bf5d029d6a56707f786f62b6097b43073dfe972869b95a2ecae80fd1 not found: ID does not exist"
Feb 28 04:17:42 crc kubenswrapper[5072]: I0228 04:17:42.667500 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b94a919-0f97-48a8-aac9-4f52655d572d" path="/var/lib/kubelet/pods/3b94a919-0f97-48a8-aac9-4f52655d572d/volumes"
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.106463 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.107266 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.107357 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf"
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.108536 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.108691 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35" gracePeriod=600
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.550186 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35" exitCode=0
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.550263 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35"}
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.550514 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c"}
Feb 28 04:17:50 crc kubenswrapper[5072]: I0228 04:17:50.550547 5072 scope.go:117] "RemoveContainer" containerID="719ce9cd9b92ee46887e50c3c1fe6fe218968b6e0c7c1b3e3916ed1d3f7cc26d"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.135397 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537538-6ldpp"]
Feb 28 04:18:00 crc kubenswrapper[5072]: E0228 04:18:00.136111 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b94a919-0f97-48a8-aac9-4f52655d572d" containerName="registry"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.136124 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b94a919-0f97-48a8-aac9-4f52655d572d" containerName="registry"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.136273 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b94a919-0f97-48a8-aac9-4f52655d572d" containerName="registry"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.136868 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.139527 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.139873 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.142759 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.149815 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537538-6ldpp"]
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.250260 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcwv\" (UniqueName: \"kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv\") pod \"auto-csr-approver-29537538-6ldpp\" (UID: \"6f9dfb83-81a2-4c25-b386-840173e76451\") " pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.351054 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcwv\" (UniqueName: \"kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv\") pod \"auto-csr-approver-29537538-6ldpp\" (UID: \"6f9dfb83-81a2-4c25-b386-840173e76451\") " pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.373073 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcwv\" (UniqueName: \"kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv\") pod \"auto-csr-approver-29537538-6ldpp\" (UID: \"6f9dfb83-81a2-4c25-b386-840173e76451\") " pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.471492 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:00 crc kubenswrapper[5072]: I0228 04:18:00.692854 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537538-6ldpp"]
Feb 28 04:18:00 crc kubenswrapper[5072]: W0228 04:18:00.704528 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9dfb83_81a2_4c25_b386_840173e76451.slice/crio-fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe WatchSource:0}: Error finding container fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe: Status 404 returned error can't find the container with id fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe
Feb 28 04:18:01 crc kubenswrapper[5072]: I0228 04:18:01.610416 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537538-6ldpp" event={"ID":"6f9dfb83-81a2-4c25-b386-840173e76451","Type":"ContainerStarted","Data":"fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe"}
Feb 28 04:18:02 crc kubenswrapper[5072]: I0228 04:18:02.621869 5072 generic.go:334] "Generic (PLEG): container finished" podID="6f9dfb83-81a2-4c25-b386-840173e76451" containerID="ebc7329fd199392620135b9178139adfb3e4dd155c2803ce4f8a0ed8ffc19df3" exitCode=0
Feb 28 04:18:02 crc kubenswrapper[5072]: I0228 04:18:02.622053 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537538-6ldpp" event={"ID":"6f9dfb83-81a2-4c25-b386-840173e76451","Type":"ContainerDied","Data":"ebc7329fd199392620135b9178139adfb3e4dd155c2803ce4f8a0ed8ffc19df3"}
Feb 28 04:18:03 crc kubenswrapper[5072]: I0228 04:18:03.931284 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:03 crc kubenswrapper[5072]: I0228 04:18:03.995117 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qcwv\" (UniqueName: \"kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv\") pod \"6f9dfb83-81a2-4c25-b386-840173e76451\" (UID: \"6f9dfb83-81a2-4c25-b386-840173e76451\") "
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.000459 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv" (OuterVolumeSpecName: "kube-api-access-9qcwv") pod "6f9dfb83-81a2-4c25-b386-840173e76451" (UID: "6f9dfb83-81a2-4c25-b386-840173e76451"). InnerVolumeSpecName "kube-api-access-9qcwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.096299 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qcwv\" (UniqueName: \"kubernetes.io/projected/6f9dfb83-81a2-4c25-b386-840173e76451-kube-api-access-9qcwv\") on node \"crc\" DevicePath \"\""
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.635862 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537538-6ldpp" event={"ID":"6f9dfb83-81a2-4c25-b386-840173e76451","Type":"ContainerDied","Data":"fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe"}
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.635903 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0d2326638d42e0b545a3356c5b97a70da000c50f29ccb5d938308d7590afbe"
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.636341 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537538-6ldpp"
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.989899 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537532-qwbxr"]
Feb 28 04:18:04 crc kubenswrapper[5072]: I0228 04:18:04.994099 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537532-qwbxr"]
Feb 28 04:18:06 crc kubenswrapper[5072]: I0228 04:18:06.666723 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d29ab8-044b-4fc5-b5eb-02c5ac608dac" path="/var/lib/kubelet/pods/c5d29ab8-044b-4fc5-b5eb-02c5ac608dac/volumes"
Feb 28 04:19:50 crc kubenswrapper[5072]: I0228 04:19:50.106181 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:19:50 crc kubenswrapper[5072]: I0228 04:19:50.107374 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.134089 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537540-rcqfw"]
Feb 28 04:20:00 crc kubenswrapper[5072]: E0228 04:20:00.134690 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9dfb83-81a2-4c25-b386-840173e76451" containerName="oc"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.134706 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9dfb83-81a2-4c25-b386-840173e76451" containerName="oc"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.134807 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9dfb83-81a2-4c25-b386-840173e76451" containerName="oc"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.135152 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.137588 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.137882 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.138138 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.141387 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537540-rcqfw"]
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.237841 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdgcn\" (UniqueName: \"kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn\") pod \"auto-csr-approver-29537540-rcqfw\" (UID: \"e7aa2b36-997c-4e7e-b869-a116e9e9fd74\") " pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.339056 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdgcn\" (UniqueName: \"kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn\") pod \"auto-csr-approver-29537540-rcqfw\" (UID: \"e7aa2b36-997c-4e7e-b869-a116e9e9fd74\") " pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.356427 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdgcn\" (UniqueName: \"kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn\") pod \"auto-csr-approver-29537540-rcqfw\" (UID: \"e7aa2b36-997c-4e7e-b869-a116e9e9fd74\") " pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.498921 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.879817 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537540-rcqfw"]
Feb 28 04:20:00 crc kubenswrapper[5072]: I0228 04:20:00.894218 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 04:20:01 crc kubenswrapper[5072]: I0228 04:20:01.277782 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537540-rcqfw" event={"ID":"e7aa2b36-997c-4e7e-b869-a116e9e9fd74","Type":"ContainerStarted","Data":"31b964b4151283cbdcb1131e5a99e5651fd8f2d18e9bb561abb96bdfc614d64a"}
Feb 28 04:20:02 crc kubenswrapper[5072]: I0228 04:20:02.283937 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537540-rcqfw" event={"ID":"e7aa2b36-997c-4e7e-b869-a116e9e9fd74","Type":"ContainerStarted","Data":"8ff26ba1f6c190544340eaba8e2adc86958af681c78d833fb1ede91de0c7b797"}
Feb 28 04:20:02 crc kubenswrapper[5072]: I0228 04:20:02.303448 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537540-rcqfw" podStartSLOduration=1.3038831530000001 podStartE2EDuration="2.30342759s" podCreationTimestamp="2026-02-28 04:20:00 +0000 UTC" firstStartedPulling="2026-02-28 04:20:00.894037981 +0000 UTC m=+622.888768173" lastFinishedPulling="2026-02-28 04:20:01.893582378 +0000 UTC m=+623.888312610" observedRunningTime="2026-02-28 04:20:02.301839741 +0000 UTC m=+624.296569933" watchObservedRunningTime="2026-02-28 04:20:02.30342759 +0000 UTC m=+624.298157792"
Feb 28 04:20:03 crc kubenswrapper[5072]: I0228 04:20:03.292020 5072 generic.go:334] "Generic (PLEG): container finished" podID="e7aa2b36-997c-4e7e-b869-a116e9e9fd74" containerID="8ff26ba1f6c190544340eaba8e2adc86958af681c78d833fb1ede91de0c7b797" exitCode=0
Feb 28 04:20:03 crc kubenswrapper[5072]: I0228 04:20:03.292084 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537540-rcqfw" event={"ID":"e7aa2b36-997c-4e7e-b869-a116e9e9fd74","Type":"ContainerDied","Data":"8ff26ba1f6c190544340eaba8e2adc86958af681c78d833fb1ede91de0c7b797"}
Feb 28 04:20:04 crc kubenswrapper[5072]: I0228 04:20:04.567708 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:04 crc kubenswrapper[5072]: I0228 04:20:04.704393 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdgcn\" (UniqueName: \"kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn\") pod \"e7aa2b36-997c-4e7e-b869-a116e9e9fd74\" (UID: \"e7aa2b36-997c-4e7e-b869-a116e9e9fd74\") "
Feb 28 04:20:04 crc kubenswrapper[5072]: I0228 04:20:04.708577 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn" (OuterVolumeSpecName: "kube-api-access-kdgcn") pod "e7aa2b36-997c-4e7e-b869-a116e9e9fd74" (UID: "e7aa2b36-997c-4e7e-b869-a116e9e9fd74"). InnerVolumeSpecName "kube-api-access-kdgcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:20:04 crc kubenswrapper[5072]: I0228 04:20:04.805434 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdgcn\" (UniqueName: \"kubernetes.io/projected/e7aa2b36-997c-4e7e-b869-a116e9e9fd74-kube-api-access-kdgcn\") on node \"crc\" DevicePath \"\""
Feb 28 04:20:05 crc kubenswrapper[5072]: I0228 04:20:05.317189 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537540-rcqfw" event={"ID":"e7aa2b36-997c-4e7e-b869-a116e9e9fd74","Type":"ContainerDied","Data":"31b964b4151283cbdcb1131e5a99e5651fd8f2d18e9bb561abb96bdfc614d64a"}
Feb 28 04:20:05 crc kubenswrapper[5072]: I0228 04:20:05.317264 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31b964b4151283cbdcb1131e5a99e5651fd8f2d18e9bb561abb96bdfc614d64a"
Feb 28 04:20:05 crc kubenswrapper[5072]: I0228 04:20:05.317326 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537540-rcqfw"
Feb 28 04:20:05 crc kubenswrapper[5072]: I0228 04:20:05.354337 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537534-5p5md"]
Feb 28 04:20:05 crc kubenswrapper[5072]: I0228 04:20:05.358695 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537534-5p5md"]
Feb 28 04:20:06 crc kubenswrapper[5072]: I0228 04:20:06.664993 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac" path="/var/lib/kubelet/pods/a0ab1ca7-1f19-4416-9ebb-95ceffdc14ac/volumes"
Feb 28 04:20:20 crc kubenswrapper[5072]: I0228 04:20:20.105767 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect:
connection refused" start-of-body= Feb 28 04:20:20 crc kubenswrapper[5072]: I0228 04:20:20.106379 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.106164 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.106951 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.107007 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.107678 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.107772 5072 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c" gracePeriod=600 Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.577485 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c" exitCode=0 Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.577574 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c"} Feb 28 04:20:50 crc kubenswrapper[5072]: I0228 04:20:50.577675 5072 scope.go:117] "RemoveContainer" containerID="41cc17a140060c4cb6a238bd77382a9cc2f3dd6470af9dc5b7a487f87ffd0e35" Feb 28 04:20:51 crc kubenswrapper[5072]: I0228 04:20:51.589822 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079"} Feb 28 04:21:20 crc kubenswrapper[5072]: I0228 04:21:20.054752 5072 scope.go:117] "RemoveContainer" containerID="95d0fc9d8c611beafb142746f4af121ed67a7672b0fe8f9de762c642787f81f9" Feb 28 04:21:20 crc kubenswrapper[5072]: I0228 04:21:20.092112 5072 scope.go:117] "RemoveContainer" containerID="4399afb27e957fc7a9641764c0f29b17e57c2b8b1dd42a347c19d81079014e38" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.134013 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537542-98224"] Feb 28 04:22:00 crc kubenswrapper[5072]: E0228 04:22:00.134988 5072 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e7aa2b36-997c-4e7e-b869-a116e9e9fd74" containerName="oc" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.135008 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7aa2b36-997c-4e7e-b869-a116e9e9fd74" containerName="oc" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.135134 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7aa2b36-997c-4e7e-b869-a116e9e9fd74" containerName="oc" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.135605 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.138701 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.138845 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.140201 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.143234 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537542-98224"] Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.154088 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxpqh\" (UniqueName: \"kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh\") pod \"auto-csr-approver-29537542-98224\" (UID: \"1652ce01-0324-46d3-8f09-e946acb926e4\") " pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.255022 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxpqh\" (UniqueName: 
\"kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh\") pod \"auto-csr-approver-29537542-98224\" (UID: \"1652ce01-0324-46d3-8f09-e946acb926e4\") " pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.276619 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxpqh\" (UniqueName: \"kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh\") pod \"auto-csr-approver-29537542-98224\" (UID: \"1652ce01-0324-46d3-8f09-e946acb926e4\") " pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.461160 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.698219 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537542-98224"] Feb 28 04:22:00 crc kubenswrapper[5072]: I0228 04:22:00.988517 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537542-98224" event={"ID":"1652ce01-0324-46d3-8f09-e946acb926e4","Type":"ContainerStarted","Data":"477ea832cc6f007d6731d4392b6c8cfa21365a7a150310d5888531e799347df5"} Feb 28 04:22:01 crc kubenswrapper[5072]: I0228 04:22:01.994343 5072 generic.go:334] "Generic (PLEG): container finished" podID="1652ce01-0324-46d3-8f09-e946acb926e4" containerID="b027ac162ddf00b261727df3c966b678a3f6e1c5500fe190c8c13f10a09e355f" exitCode=0 Feb 28 04:22:01 crc kubenswrapper[5072]: I0228 04:22:01.994447 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537542-98224" event={"ID":"1652ce01-0324-46d3-8f09-e946acb926e4","Type":"ContainerDied","Data":"b027ac162ddf00b261727df3c966b678a3f6e1c5500fe190c8c13f10a09e355f"} Feb 28 04:22:03 crc kubenswrapper[5072]: I0228 04:22:03.172252 5072 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:03 crc kubenswrapper[5072]: I0228 04:22:03.295122 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxpqh\" (UniqueName: \"kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh\") pod \"1652ce01-0324-46d3-8f09-e946acb926e4\" (UID: \"1652ce01-0324-46d3-8f09-e946acb926e4\") " Feb 28 04:22:03 crc kubenswrapper[5072]: I0228 04:22:03.299918 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh" (OuterVolumeSpecName: "kube-api-access-mxpqh") pod "1652ce01-0324-46d3-8f09-e946acb926e4" (UID: "1652ce01-0324-46d3-8f09-e946acb926e4"). InnerVolumeSpecName "kube-api-access-mxpqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:22:03 crc kubenswrapper[5072]: I0228 04:22:03.396812 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxpqh\" (UniqueName: \"kubernetes.io/projected/1652ce01-0324-46d3-8f09-e946acb926e4-kube-api-access-mxpqh\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.011278 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537542-98224" event={"ID":"1652ce01-0324-46d3-8f09-e946acb926e4","Type":"ContainerDied","Data":"477ea832cc6f007d6731d4392b6c8cfa21365a7a150310d5888531e799347df5"} Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.011328 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="477ea832cc6f007d6731d4392b6c8cfa21365a7a150310d5888531e799347df5" Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.011388 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537542-98224" Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.231185 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537536-n4b67"] Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.236160 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537536-n4b67"] Feb 28 04:22:04 crc kubenswrapper[5072]: I0228 04:22:04.667876 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55524219-a6bb-4464-954e-2a8726c25a20" path="/var/lib/kubelet/pods/55524219-a6bb-4464-954e-2a8726c25a20/volumes" Feb 28 04:22:20 crc kubenswrapper[5072]: I0228 04:22:20.145909 5072 scope.go:117] "RemoveContainer" containerID="621788dbc76b372a609765bf1a78d22c609f24c553336f73fcd9ac806eaba91f" Feb 28 04:22:50 crc kubenswrapper[5072]: I0228 04:22:50.105451 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:22:50 crc kubenswrapper[5072]: I0228 04:22:50.106317 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.039861 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kfpqp"] Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040259 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" 
podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-controller" containerID="cri-o://71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040382 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="nbdb" containerID="cri-o://84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040369 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040431 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="northd" containerID="cri-o://2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040430 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-acl-logging" containerID="cri-o://4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040474 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-node" 
containerID="cri-o://91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.040515 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="sbdb" containerID="cri-o://1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.074440 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" containerID="cri-o://d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37" gracePeriod=30 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.340680 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/2.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.341005 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/1.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.341033 5072 generic.go:334] "Generic (PLEG): container finished" podID="ae699423-376d-4342-bf44-7d70f68fadd1" containerID="7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38" exitCode=2 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.341072 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerDied","Data":"7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.341107 5072 scope.go:117] "RemoveContainer" containerID="f76011dc7c4eafa5461342a10320cb4acfd8b21eeb9293364d11c7a30744e0aa" Feb 
28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.341443 5072 scope.go:117] "RemoveContainer" containerID="7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.341770 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8pz98_openshift-multus(ae699423-376d-4342-bf44-7d70f68fadd1)\"" pod="openshift-multus/multus-8pz98" podUID="ae699423-376d-4342-bf44-7d70f68fadd1" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.344392 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/3.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.347477 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-acl-logging/0.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.347905 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-controller/0.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348211 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37" exitCode=0 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348277 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7" exitCode=0 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348287 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" 
containerID="84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e" exitCode=0 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348294 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8" exitCode=0 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348302 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0" exitCode=0 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348310 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456" exitCode=143 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348317 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e" exitCode=143 Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348334 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348377 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348389 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" 
event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348398 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348406 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348415 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.348423 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e"} Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.402239 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovnkube-controller/3.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.404051 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-acl-logging/0.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.404425 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-controller/0.log" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.404787 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.406046 5072 scope.go:117] "RemoveContainer" containerID="edf6968db2420b0ba248f891852f08cbe7f9e463c9c1c480317d1e388f377057" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440820 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440872 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440907 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440925 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440941 5072 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440958 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440975 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.440991 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441006 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441021 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: 
\"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441036 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441048 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441061 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441079 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvpck\" (UniqueName: \"kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441101 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441120 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441132 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441148 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441165 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.441186 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns\") pod \"043491df-2577-47f6-9a5b-03fecada16ce\" (UID: \"043491df-2577-47f6-9a5b-03fecada16ce\") " Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.444843 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: 
"043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.444887 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445020 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.444914 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket" (OuterVolumeSpecName: "log-socket") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445115 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445142 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445265 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445347 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445055 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log" (OuterVolumeSpecName: "node-log") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445604 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445656 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445719 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445772 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445853 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash" (OuterVolumeSpecName: "host-slash") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.445892 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.446460 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.447628 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452124 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-brbjn"] Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452416 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452518 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452534 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452544 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-node" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452550 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-node" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452561 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kubecfg-setup" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452567 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kubecfg-setup" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452573 5072 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452579 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452587 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452592 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452600 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-acl-logging" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452605 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-acl-logging" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452614 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452621 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452627 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452632 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452663 5072 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452671 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452680 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1652ce01-0324-46d3-8f09-e946acb926e4" containerName="oc" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452686 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="1652ce01-0324-46d3-8f09-e946acb926e4" containerName="oc" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452696 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="nbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452702 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="nbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452708 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="sbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452713 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="sbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452722 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="northd" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452727 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="northd" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452812 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" 
containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452821 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452829 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="northd" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452838 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452845 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452854 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452861 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452869 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-node" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452877 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="1652ce01-0324-46d3-8f09-e946acb926e4" containerName="oc" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452884 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovn-acl-logging" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452891 5072 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="nbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452898 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452917 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="sbdb" Feb 28 04:22:51 crc kubenswrapper[5072]: E0228 04:22:51.452993 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.452999 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="043491df-2577-47f6-9a5b-03fecada16ce" containerName="ovnkube-controller" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.454134 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck" (OuterVolumeSpecName: "kube-api-access-pvpck") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "kube-api-access-pvpck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.454773 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.463037 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "043491df-2577-47f6-9a5b-03fecada16ce" (UID: "043491df-2577-47f6-9a5b-03fecada16ce"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542447 5072 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-log-socket\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542707 5072 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542772 5072 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542893 5072 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542911 5072 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542922 5072 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542931 5072 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542939 5072 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-node-log\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542948 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvpck\" (UniqueName: \"kubernetes.io/projected/043491df-2577-47f6-9a5b-03fecada16ce-kube-api-access-pvpck\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542957 5072 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542965 5072 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-slash\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542974 5072 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542982 5072 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542991 5072 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.542999 5072 reconciler_common.go:293] "Volume detached for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.543007 5072 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.543014 5072 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.543022 5072 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/043491df-2577-47f6-9a5b-03fecada16ce-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.543032 5072 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/043491df-2577-47f6-9a5b-03fecada16ce-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.543041 5072 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/043491df-2577-47f6-9a5b-03fecada16ce-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.643548 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-slash\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.643750 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-script-lib\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.643868 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-systemd-units\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.643973 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-bin\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644060 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-etc-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644155 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-log-socket\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644249 5072 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-ovn\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644339 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644559 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-config\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644742 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-netd\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644837 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644867 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644899 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-systemd\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644926 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-node-log\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644959 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-netns\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.644983 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-env-overrides\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.645010 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdp2\" (UniqueName: \"kubernetes.io/projected/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-kube-api-access-fzdp2\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.645037 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-kubelet\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.645060 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-var-lib-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.645081 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.745578 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdp2\" (UniqueName: \"kubernetes.io/projected/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-kube-api-access-fzdp2\") pod \"ovnkube-node-brbjn\" (UID: 
\"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.745862 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-kubelet\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.745971 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746047 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-var-lib-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746002 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-kubelet\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746208 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-slash\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746134 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-var-lib-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746127 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-slash\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746406 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-script-lib\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746471 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-systemd-units\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746528 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-bin\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc 
kubenswrapper[5072]: I0228 04:22:51.746592 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-etc-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746688 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-log-socket\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746777 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-ovn\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746949 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747066 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-config\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747138 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747145 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-netd\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747188 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747208 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747237 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-systemd\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747255 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-log-socket\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747262 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-node-log\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747305 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-ovn\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746857 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-etc-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747384 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-env-overrides\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747407 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-netns\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747496 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-netd\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747520 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-openvswitch\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747527 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-netns\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747542 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747564 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-run-systemd\") pod \"ovnkube-node-brbjn\" 
(UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746876 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-systemd-units\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.748202 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-env-overrides\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.747284 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-node-log\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.746891 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-host-cni-bin\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.748304 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-script-lib\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc 
kubenswrapper[5072]: I0228 04:22:51.748552 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovnkube-config\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.750609 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.762463 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdp2\" (UniqueName: \"kubernetes.io/projected/80296180-6b5d-4baa-ac35-b9d9b20bf1d4-kube-api-access-fzdp2\") pod \"ovnkube-node-brbjn\" (UID: \"80296180-6b5d-4baa-ac35-b9d9b20bf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:51 crc kubenswrapper[5072]: I0228 04:22:51.776488 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.358601 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-acl-logging/0.log" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.360310 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kfpqp_043491df-2577-47f6-9a5b-03fecada16ce/ovn-controller/0.log" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.360854 5072 generic.go:334] "Generic (PLEG): container finished" podID="043491df-2577-47f6-9a5b-03fecada16ce" containerID="2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906" exitCode=0 Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.360938 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906"} Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.360964 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" event={"ID":"043491df-2577-47f6-9a5b-03fecada16ce","Type":"ContainerDied","Data":"859207fea6a0d5434ba1a1de90bfddda710cea2d855120a8c045ab8b2aa9e1f9"} Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.360985 5072 scope.go:117] "RemoveContainer" containerID="d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.361137 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfpqp" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.362545 5072 generic.go:334] "Generic (PLEG): container finished" podID="80296180-6b5d-4baa-ac35-b9d9b20bf1d4" containerID="51202cd03e31a5706b7d179b0bef9595ff15d93f23808712a6ac70f44b03174b" exitCode=0 Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.362673 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerDied","Data":"51202cd03e31a5706b7d179b0bef9595ff15d93f23808712a6ac70f44b03174b"} Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.362720 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"a6c3051831a851566ba2ba37f0d2e3c0c4231f1a7a0c65d5cb8e6501544ab76e"} Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.369867 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/2.log" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.397080 5072 scope.go:117] "RemoveContainer" containerID="1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.419232 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kfpqp"] Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.423911 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kfpqp"] Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.436838 5072 scope.go:117] "RemoveContainer" containerID="84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.480941 5072 scope.go:117] "RemoveContainer" 
containerID="2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.506051 5072 scope.go:117] "RemoveContainer" containerID="7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.517611 5072 scope.go:117] "RemoveContainer" containerID="91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.539154 5072 scope.go:117] "RemoveContainer" containerID="4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.556362 5072 scope.go:117] "RemoveContainer" containerID="71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.575338 5072 scope.go:117] "RemoveContainer" containerID="604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.593058 5072 scope.go:117] "RemoveContainer" containerID="d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.593538 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37\": container with ID starting with d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37 not found: ID does not exist" containerID="d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.593569 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37"} err="failed to get container status \"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37\": rpc error: code = NotFound desc = could not 
find container \"d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37\": container with ID starting with d9ead4e3700bb4d4844b72459a0b5e14216c9b711ffa9ed2f2930565eca72f37 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.593588 5072 scope.go:117] "RemoveContainer" containerID="1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.593935 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\": container with ID starting with 1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7 not found: ID does not exist" containerID="1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.593955 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7"} err="failed to get container status \"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\": rpc error: code = NotFound desc = could not find container \"1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7\": container with ID starting with 1a89040c60374eeb830f4874c6aa31b63503ea82fb3b905a6d277e74bfd94bf7 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.593968 5072 scope.go:117] "RemoveContainer" containerID="84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.594348 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\": container with ID starting with 84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e not found: ID 
does not exist" containerID="84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.594396 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e"} err="failed to get container status \"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\": rpc error: code = NotFound desc = could not find container \"84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e\": container with ID starting with 84f55cf9664297a6d9362e9b287af8cb040499c6bb26e1399f965d3598282a6e not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.594431 5072 scope.go:117] "RemoveContainer" containerID="2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.595016 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\": container with ID starting with 2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906 not found: ID does not exist" containerID="2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595044 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906"} err="failed to get container status \"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\": rpc error: code = NotFound desc = could not find container \"2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906\": container with ID starting with 2f874bdbb43d0d89949890b40d1f3ca9e46452e816bf9069694ec56b216b3906 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595059 5072 
scope.go:117] "RemoveContainer" containerID="7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.595259 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\": container with ID starting with 7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8 not found: ID does not exist" containerID="7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595277 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8"} err="failed to get container status \"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\": rpc error: code = NotFound desc = could not find container \"7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8\": container with ID starting with 7dc86e807d2fd29a23cd65b4e3cf4b1f80d30a2a9edd3513951cd5b0952d7ea8 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595293 5072 scope.go:117] "RemoveContainer" containerID="91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.595523 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\": container with ID starting with 91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0 not found: ID does not exist" containerID="91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595551 5072 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0"} err="failed to get container status \"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\": rpc error: code = NotFound desc = could not find container \"91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0\": container with ID starting with 91b4c2ad8c3c86aa5ea8faa262339e86b4f63cd37af39e7831e5cb895bf121e0 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595573 5072 scope.go:117] "RemoveContainer" containerID="4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.595944 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\": container with ID starting with 4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456 not found: ID does not exist" containerID="4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595966 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456"} err="failed to get container status \"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\": rpc error: code = NotFound desc = could not find container \"4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456\": container with ID starting with 4d255f482994f394f8978b6a0bb07d947f055bd25c479763d2c7c349d28b7456 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.595999 5072 scope.go:117] "RemoveContainer" containerID="71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.596469 5072 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\": container with ID starting with 71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e not found: ID does not exist" containerID="71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.596496 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e"} err="failed to get container status \"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\": rpc error: code = NotFound desc = could not find container \"71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e\": container with ID starting with 71ea9e226a922b61bc00e0375234533e44b02d63d8c17966fa59eaafb56b687e not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.596512 5072 scope.go:117] "RemoveContainer" containerID="604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515" Feb 28 04:22:52 crc kubenswrapper[5072]: E0228 04:22:52.596791 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\": container with ID starting with 604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515 not found: ID does not exist" containerID="604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.596810 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515"} err="failed to get container status \"604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\": rpc error: code = NotFound desc = could not find container 
\"604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515\": container with ID starting with 604b9064779dfbcded213dd20d47aeb40b796476d9cc6abb00e9700b1b3a7515 not found: ID does not exist" Feb 28 04:22:52 crc kubenswrapper[5072]: I0228 04:22:52.665493 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043491df-2577-47f6-9a5b-03fecada16ce" path="/var/lib/kubelet/pods/043491df-2577-47f6-9a5b-03fecada16ce/volumes" Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.380773 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"1ed271ada1146048eebfd87c0c8b62f05cb12ef3778c8df2b0e7d6b0b115a70c"} Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.381187 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"01106f568ff9bfbf82d796d9200f6daad97497c1ca9bb046079346b2894b7754"} Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.381202 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"8eb6382c27f9d999388d6f6cf9a466363025fc9fd1086d361ab760d082aad97f"} Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.381213 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"cb66cd3e8f88a9d8222cfa5eda4e75ebc70c69ba31c3da60a18d2fbfd6632daf"} Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.381222 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" 
event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"012f112194b86c08b7b7149e8e0f9f0aaef26cc61667c1a0e6eed7ecd030b03c"} Feb 28 04:22:53 crc kubenswrapper[5072]: I0228 04:22:53.381231 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"2859f501b3e03ddd4cb57ad57b070d2240e192193db237383d2c1d373be70a7a"} Feb 28 04:22:55 crc kubenswrapper[5072]: I0228 04:22:55.395793 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"a242a9145453002c30f4f771750f5c5554c220ab5a4833963aa9ee4179b42a18"} Feb 28 04:22:58 crc kubenswrapper[5072]: I0228 04:22:58.418905 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" event={"ID":"80296180-6b5d-4baa-ac35-b9d9b20bf1d4","Type":"ContainerStarted","Data":"53777e9d3a3c59027ed738ca7eca9d66a0c8a0cc8a8caf363cbaaea51b8d0836"} Feb 28 04:22:58 crc kubenswrapper[5072]: I0228 04:22:58.420226 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:58 crc kubenswrapper[5072]: I0228 04:22:58.450129 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:58 crc kubenswrapper[5072]: I0228 04:22:58.475486 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" podStartSLOduration=7.4754663820000005 podStartE2EDuration="7.475466382s" podCreationTimestamp="2026-02-28 04:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:22:58.444113408 +0000 UTC m=+800.438843600" 
watchObservedRunningTime="2026-02-28 04:22:58.475466382 +0000 UTC m=+800.470196574" Feb 28 04:22:59 crc kubenswrapper[5072]: I0228 04:22:59.424339 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:59 crc kubenswrapper[5072]: I0228 04:22:59.424690 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:22:59 crc kubenswrapper[5072]: I0228 04:22:59.485857 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.090829 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt"] Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.094839 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.095088 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt"] Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.096952 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.183106 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 
04:23:01.183177 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.183212 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcqgt\" (UniqueName: \"kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.285147 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.285267 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.285325 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pcqgt\" (UniqueName: \"kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.285830 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.285876 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.305608 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcqgt\" (UniqueName: \"kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.423264 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: E0228 04:23:01.453098 5072 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(26b53d137de141d787a3c25d222bcee4ad9fc651a33644cc2b38c8a350ff2afa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:23:01 crc kubenswrapper[5072]: E0228 04:23:01.453170 5072 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(26b53d137de141d787a3c25d222bcee4ad9fc651a33644cc2b38c8a350ff2afa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: E0228 04:23:01.453195 5072 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(26b53d137de141d787a3c25d222bcee4ad9fc651a33644cc2b38c8a350ff2afa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:01 crc kubenswrapper[5072]: E0228 04:23:01.453301 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(26b53d137de141d787a3c25d222bcee4ad9fc651a33644cc2b38c8a350ff2afa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" Feb 28 04:23:01 crc kubenswrapper[5072]: I0228 04:23:01.658980 5072 scope.go:117] "RemoveContainer" containerID="7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38" Feb 28 04:23:01 crc kubenswrapper[5072]: E0228 04:23:01.659231 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8pz98_openshift-multus(ae699423-376d-4342-bf44-7d70f68fadd1)\"" pod="openshift-multus/multus-8pz98" podUID="ae699423-376d-4342-bf44-7d70f68fadd1" Feb 28 04:23:02 crc kubenswrapper[5072]: I0228 04:23:02.439883 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:02 crc kubenswrapper[5072]: I0228 04:23:02.440766 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:02 crc kubenswrapper[5072]: E0228 04:23:02.465659 5072 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(34d0158bf74a9bc2552e40973aac719402163ac9d973081fcbfcc04404c5341f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:23:02 crc kubenswrapper[5072]: E0228 04:23:02.465746 5072 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(34d0158bf74a9bc2552e40973aac719402163ac9d973081fcbfcc04404c5341f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:02 crc kubenswrapper[5072]: E0228 04:23:02.465780 5072 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(34d0158bf74a9bc2552e40973aac719402163ac9d973081fcbfcc04404c5341f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:02 crc kubenswrapper[5072]: E0228 04:23:02.465840 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(34d0158bf74a9bc2552e40973aac719402163ac9d973081fcbfcc04404c5341f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" Feb 28 04:23:14 crc kubenswrapper[5072]: I0228 04:23:14.658069 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:14 crc kubenswrapper[5072]: I0228 04:23:14.658761 5072 scope.go:117] "RemoveContainer" containerID="7e741ee8743c0d8ce1eff62104ff5ebc16f00b9727d0e70f0a3c873cc615ed38" Feb 28 04:23:14 crc kubenswrapper[5072]: I0228 04:23:14.659605 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:14 crc kubenswrapper[5072]: E0228 04:23:14.709232 5072 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(d1a173808bc6e6e7936a433b4e71623e6442b40a339067ed96323cac714bab14): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 04:23:14 crc kubenswrapper[5072]: E0228 04:23:14.709759 5072 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(d1a173808bc6e6e7936a433b4e71623e6442b40a339067ed96323cac714bab14): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:14 crc kubenswrapper[5072]: E0228 04:23:14.709832 5072 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(d1a173808bc6e6e7936a433b4e71623e6442b40a339067ed96323cac714bab14): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:14 crc kubenswrapper[5072]: E0228 04:23:14.709954 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace(c8b2c327-09b5-479e-b2e9-8edb01862f59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_openshift-marketplace_c8b2c327-09b5-479e-b2e9-8edb01862f59_0(d1a173808bc6e6e7936a433b4e71623e6442b40a339067ed96323cac714bab14): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" Feb 28 04:23:15 crc kubenswrapper[5072]: I0228 04:23:15.522619 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pz98_ae699423-376d-4342-bf44-7d70f68fadd1/kube-multus/2.log" Feb 28 04:23:15 crc kubenswrapper[5072]: I0228 04:23:15.522960 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pz98" event={"ID":"ae699423-376d-4342-bf44-7d70f68fadd1","Type":"ContainerStarted","Data":"27a16b0218175b939405aaebfab38f1b2be80d59e43b1d4a6313cc1351cb7ba6"} Feb 28 04:23:20 crc kubenswrapper[5072]: I0228 04:23:20.105862 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:23:20 crc kubenswrapper[5072]: I0228 
04:23:20.106213 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:23:21 crc kubenswrapper[5072]: I0228 04:23:21.799097 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brbjn" Feb 28 04:23:27 crc kubenswrapper[5072]: I0228 04:23:27.378382 5072 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 28 04:23:29 crc kubenswrapper[5072]: I0228 04:23:29.658261 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:29 crc kubenswrapper[5072]: I0228 04:23:29.659060 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:29 crc kubenswrapper[5072]: I0228 04:23:29.870801 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt"] Feb 28 04:23:29 crc kubenswrapper[5072]: W0228 04:23:29.877890 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b2c327_09b5_479e_b2e9_8edb01862f59.slice/crio-78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656 WatchSource:0}: Error finding container 78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656: Status 404 returned error can't find the container with id 78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656 Feb 28 04:23:30 crc kubenswrapper[5072]: I0228 04:23:30.608793 5072 generic.go:334] "Generic (PLEG): container finished" podID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerID="eebdb6a88a8a8b53aecef79a7492077984f5e5a55727ae769f2bf5da06203a41" exitCode=0 Feb 28 04:23:30 crc kubenswrapper[5072]: I0228 04:23:30.608852 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" event={"ID":"c8b2c327-09b5-479e-b2e9-8edb01862f59","Type":"ContainerDied","Data":"eebdb6a88a8a8b53aecef79a7492077984f5e5a55727ae769f2bf5da06203a41"} Feb 28 04:23:30 crc kubenswrapper[5072]: I0228 04:23:30.608885 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" event={"ID":"c8b2c327-09b5-479e-b2e9-8edb01862f59","Type":"ContainerStarted","Data":"78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656"} Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.623634 5072 generic.go:334] "Generic (PLEG): container finished" 
podID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerID="862f2991b8f3eb5f9b503f21c8d0c1b9e278a45ec7fdcc1b39632ab8dea14f41" exitCode=0 Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.623687 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" event={"ID":"c8b2c327-09b5-479e-b2e9-8edb01862f59","Type":"ContainerDied","Data":"862f2991b8f3eb5f9b503f21c8d0c1b9e278a45ec7fdcc1b39632ab8dea14f41"} Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.871628 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"] Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.872957 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.883699 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"] Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.911205 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.911265 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:32 crc kubenswrapper[5072]: I0228 04:23:32.911291 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fw2l6\" (UniqueName: \"kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.012831 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.012884 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2l6\" (UniqueName: \"kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.012945 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.013384 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.013414 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.032041 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2l6\" (UniqueName: \"kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6\") pod \"redhat-operators-fhpln\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") " pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.257401 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhpln" Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.436840 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"] Feb 28 04:23:33 crc kubenswrapper[5072]: W0228 04:23:33.445389 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e43a8c_291f_4213_9c4f_eb20aa080e56.slice/crio-b571310a1b5f61495b090deb293c698957119f56306ce415d74a66a738fe3c85 WatchSource:0}: Error finding container b571310a1b5f61495b090deb293c698957119f56306ce415d74a66a738fe3c85: Status 404 returned error can't find the container with id b571310a1b5f61495b090deb293c698957119f56306ce415d74a66a738fe3c85 Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.629983 5072 generic.go:334] "Generic (PLEG): container finished" podID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerID="b293c6ab2cd7f17509d56ba16f7ce3e51ee91dafe604b56d506212c94c5211cc" exitCode=0 Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.630040 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" 
event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerDied","Data":"b293c6ab2cd7f17509d56ba16f7ce3e51ee91dafe604b56d506212c94c5211cc"} Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.630086 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerStarted","Data":"b571310a1b5f61495b090deb293c698957119f56306ce415d74a66a738fe3c85"} Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.633535 5072 generic.go:334] "Generic (PLEG): container finished" podID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerID="b4ea2218c537dbed5bfc729bc5196a83b3a9084d3ad89219cebd2b50d4f4dc46" exitCode=0 Feb 28 04:23:33 crc kubenswrapper[5072]: I0228 04:23:33.633574 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" event={"ID":"c8b2c327-09b5-479e-b2e9-8edb01862f59","Type":"ContainerDied","Data":"b4ea2218c537dbed5bfc729bc5196a83b3a9084d3ad89219cebd2b50d4f4dc46"} Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.643418 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerStarted","Data":"77609e7c15c985fd05e1b3ca45d52c4a5b5bdc28b9722ad4532e67a228456911"} Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.870315 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.934564 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcqgt\" (UniqueName: \"kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt\") pod \"c8b2c327-09b5-479e-b2e9-8edb01862f59\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.934725 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle\") pod \"c8b2c327-09b5-479e-b2e9-8edb01862f59\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.934764 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util\") pod \"c8b2c327-09b5-479e-b2e9-8edb01862f59\" (UID: \"c8b2c327-09b5-479e-b2e9-8edb01862f59\") " Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.935592 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle" (OuterVolumeSpecName: "bundle") pod "c8b2c327-09b5-479e-b2e9-8edb01862f59" (UID: "c8b2c327-09b5-479e-b2e9-8edb01862f59"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.939990 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt" (OuterVolumeSpecName: "kube-api-access-pcqgt") pod "c8b2c327-09b5-479e-b2e9-8edb01862f59" (UID: "c8b2c327-09b5-479e-b2e9-8edb01862f59"). InnerVolumeSpecName "kube-api-access-pcqgt". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:23:34 crc kubenswrapper[5072]: I0228 04:23:34.964160 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util" (OuterVolumeSpecName: "util") pod "c8b2c327-09b5-479e-b2e9-8edb01862f59" (UID: "c8b2c327-09b5-479e-b2e9-8edb01862f59"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.036020 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcqgt\" (UniqueName: \"kubernetes.io/projected/c8b2c327-09b5-479e-b2e9-8edb01862f59-kube-api-access-pcqgt\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.036071 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.036080 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8b2c327-09b5-479e-b2e9-8edb01862f59-util\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.649280 5072 generic.go:334] "Generic (PLEG): container finished" podID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerID="77609e7c15c985fd05e1b3ca45d52c4a5b5bdc28b9722ad4532e67a228456911" exitCode=0
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.649327 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerDied","Data":"77609e7c15c985fd05e1b3ca45d52c4a5b5bdc28b9722ad4532e67a228456911"}
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.654397 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt" event={"ID":"c8b2c327-09b5-479e-b2e9-8edb01862f59","Type":"ContainerDied","Data":"78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656"}
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.654430 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bc323d88fb835c0a36b35e82195307b35834ec36c036f905b43c38e6937656"
Feb 28 04:23:35 crc kubenswrapper[5072]: I0228 04:23:35.654459 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt"
Feb 28 04:23:37 crc kubenswrapper[5072]: I0228 04:23:37.667366 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerStarted","Data":"04278a576f507059907ca3aea8202eeada8e67c00bc5556508c5f43830487a9d"}
Feb 28 04:23:37 crc kubenswrapper[5072]: I0228 04:23:37.687705 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fhpln" podStartSLOduration=2.708569068 podStartE2EDuration="5.687626385s" podCreationTimestamp="2026-02-28 04:23:32 +0000 UTC" firstStartedPulling="2026-02-28 04:23:33.631453654 +0000 UTC m=+835.626183846" lastFinishedPulling="2026-02-28 04:23:36.610510971 +0000 UTC m=+838.605241163" observedRunningTime="2026-02-28 04:23:37.683052991 +0000 UTC m=+839.677783183" watchObservedRunningTime="2026-02-28 04:23:37.687626385 +0000 UTC m=+839.682356577"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.538051 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"]
Feb 28 04:23:42 crc kubenswrapper[5072]: E0228 04:23:42.538713 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="pull"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.538731 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="pull"
Feb 28 04:23:42 crc kubenswrapper[5072]: E0228 04:23:42.538740 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="util"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.538745 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="util"
Feb 28 04:23:42 crc kubenswrapper[5072]: E0228 04:23:42.538760 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="extract"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.538767 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="extract"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.538857 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b2c327-09b5-479e-b2e9-8edb01862f59" containerName="extract"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.539327 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.541515 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.541743 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hknfd"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.543814 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.544028 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.544186 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.561706 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"]
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.729541 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-webhook-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.729609 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7bf\" (UniqueName: \"kubernetes.io/projected/66660768-8bc9-40af-baab-529d0820c10b-kube-api-access-sd7bf\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.729757 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-apiservice-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.830626 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-apiservice-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.830714 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-webhook-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.830732 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7bf\" (UniqueName: \"kubernetes.io/projected/66660768-8bc9-40af-baab-529d0820c10b-kube-api-access-sd7bf\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.836463 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-webhook-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.837278 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66660768-8bc9-40af-baab-529d0820c10b-apiservice-cert\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.848985 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7bf\" (UniqueName: \"kubernetes.io/projected/66660768-8bc9-40af-baab-529d0820c10b-kube-api-access-sd7bf\") pod \"metallb-operator-controller-manager-85768d6f57-5rpmg\" (UID: \"66660768-8bc9-40af-baab-529d0820c10b\") " pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.861060 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"]
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.861973 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.864852 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kjv9c"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.864972 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.865092 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.865344 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Feb 28 04:23:42 crc kubenswrapper[5072]: I0228 04:23:42.882497 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"]
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.033967 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qfn\" (UniqueName: \"kubernetes.io/projected/67512bfe-55b8-4df0-aa98-54225fc624a3-kube-api-access-p8qfn\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.034280 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-apiservice-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.034303 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-webhook-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.119683 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"]
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.142426 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qfn\" (UniqueName: \"kubernetes.io/projected/67512bfe-55b8-4df0-aa98-54225fc624a3-kube-api-access-p8qfn\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.142490 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-apiservice-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.142510 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-webhook-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.148059 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-webhook-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.148170 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67512bfe-55b8-4df0-aa98-54225fc624a3-apiservice-cert\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.159134 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qfn\" (UniqueName: \"kubernetes.io/projected/67512bfe-55b8-4df0-aa98-54225fc624a3-kube-api-access-p8qfn\") pod \"metallb-operator-webhook-server-6b95579fd-hmq5d\" (UID: \"67512bfe-55b8-4df0-aa98-54225fc624a3\") " pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.220213 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.257665 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.258032 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.357088 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.663539 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"]
Feb 28 04:23:43 crc kubenswrapper[5072]: W0228 04:23:43.663853 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67512bfe_55b8_4df0_aa98_54225fc624a3.slice/crio-9737037f36cd594281219282edc1575ac52c25be768c90b212686f20891bf85b WatchSource:0}: Error finding container 9737037f36cd594281219282edc1575ac52c25be768c90b212686f20891bf85b: Status 404 returned error can't find the container with id 9737037f36cd594281219282edc1575ac52c25be768c90b212686f20891bf85b
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.698778 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg" event={"ID":"66660768-8bc9-40af-baab-529d0820c10b","Type":"ContainerStarted","Data":"d4e2bf7f426c269e2617bfd53f9d7e7fa67e8d7ab0b2cf1984db6347570a52b2"}
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.699682 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d" event={"ID":"67512bfe-55b8-4df0-aa98-54225fc624a3","Type":"ContainerStarted","Data":"9737037f36cd594281219282edc1575ac52c25be768c90b212686f20891bf85b"}
Feb 28 04:23:43 crc kubenswrapper[5072]: I0228 04:23:43.738449 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:45 crc kubenswrapper[5072]: I0228 04:23:45.653589 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"]
Feb 28 04:23:46 crc kubenswrapper[5072]: I0228 04:23:46.714877 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fhpln" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="registry-server" containerID="cri-o://04278a576f507059907ca3aea8202eeada8e67c00bc5556508c5f43830487a9d" gracePeriod=2
Feb 28 04:23:47 crc kubenswrapper[5072]: I0228 04:23:47.723730 5072 generic.go:334] "Generic (PLEG): container finished" podID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerID="04278a576f507059907ca3aea8202eeada8e67c00bc5556508c5f43830487a9d" exitCode=0
Feb 28 04:23:47 crc kubenswrapper[5072]: I0228 04:23:47.723787 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerDied","Data":"04278a576f507059907ca3aea8202eeada8e67c00bc5556508c5f43830487a9d"}
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.105322 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.105707 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.105778 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf"
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.106519 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.106597 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079" gracePeriod=600
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.743335 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079" exitCode=0
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.743443 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079"}
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.743758 5072 scope.go:117] "RemoveContainer" containerID="4995e81e3b0a747e3fcdeca516d170f1d3f4a30b7f4d30dcd3a95695e36c9e2c"
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.864763 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.948538 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content\") pod \"36e43a8c-291f-4213-9c4f-eb20aa080e56\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") "
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.948873 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities\") pod \"36e43a8c-291f-4213-9c4f-eb20aa080e56\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") "
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.948966 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2l6\" (UniqueName: \"kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6\") pod \"36e43a8c-291f-4213-9c4f-eb20aa080e56\" (UID: \"36e43a8c-291f-4213-9c4f-eb20aa080e56\") "
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.950254 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities" (OuterVolumeSpecName: "utilities") pod "36e43a8c-291f-4213-9c4f-eb20aa080e56" (UID: "36e43a8c-291f-4213-9c4f-eb20aa080e56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:23:50 crc kubenswrapper[5072]: I0228 04:23:50.956935 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6" (OuterVolumeSpecName: "kube-api-access-fw2l6") pod "36e43a8c-291f-4213-9c4f-eb20aa080e56" (UID: "36e43a8c-291f-4213-9c4f-eb20aa080e56"). InnerVolumeSpecName "kube-api-access-fw2l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.050462 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.050495 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2l6\" (UniqueName: \"kubernetes.io/projected/36e43a8c-291f-4213-9c4f-eb20aa080e56-kube-api-access-fw2l6\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.082175 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36e43a8c-291f-4213-9c4f-eb20aa080e56" (UID: "36e43a8c-291f-4213-9c4f-eb20aa080e56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.152028 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e43a8c-291f-4213-9c4f-eb20aa080e56-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.751890 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhpln" event={"ID":"36e43a8c-291f-4213-9c4f-eb20aa080e56","Type":"ContainerDied","Data":"b571310a1b5f61495b090deb293c698957119f56306ce415d74a66a738fe3c85"}
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.751945 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhpln"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.751971 5072 scope.go:117] "RemoveContainer" containerID="04278a576f507059907ca3aea8202eeada8e67c00bc5556508c5f43830487a9d"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.754199 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d" event={"ID":"67512bfe-55b8-4df0-aa98-54225fc624a3","Type":"ContainerStarted","Data":"13e3144ecb511154d4c726fcfe0f45fa92ebe2f490ecabe1cc7fccdb44b2a203"}
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.755015 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.758866 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023"}
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.761352 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg" event={"ID":"66660768-8bc9-40af-baab-529d0820c10b","Type":"ContainerStarted","Data":"23a05bbc89418a549d640c625c27452425805b6207f5cebdca2142659bb38c85"}
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.761617 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.771132 5072 scope.go:117] "RemoveContainer" containerID="77609e7c15c985fd05e1b3ca45d52c4a5b5bdc28b9722ad4532e67a228456911"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.782528 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d" podStartSLOduration=2.562101719 podStartE2EDuration="9.78251409s" podCreationTimestamp="2026-02-28 04:23:42 +0000 UTC" firstStartedPulling="2026-02-28 04:23:43.668989109 +0000 UTC m=+845.663719301" lastFinishedPulling="2026-02-28 04:23:50.88940148 +0000 UTC m=+852.884131672" observedRunningTime="2026-02-28 04:23:51.778789573 +0000 UTC m=+853.773519765" watchObservedRunningTime="2026-02-28 04:23:51.78251409 +0000 UTC m=+853.777244282"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.809341 5072 scope.go:117] "RemoveContainer" containerID="b293c6ab2cd7f17509d56ba16f7ce3e51ee91dafe604b56d506212c94c5211cc"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.843460 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg" podStartSLOduration=2.137629865 podStartE2EDuration="9.84343775s" podCreationTimestamp="2026-02-28 04:23:42 +0000 UTC" firstStartedPulling="2026-02-28 04:23:43.146776576 +0000 UTC m=+845.141506758" lastFinishedPulling="2026-02-28 04:23:50.852584451 +0000 UTC m=+852.847314643" observedRunningTime="2026-02-28 04:23:51.840431686 +0000 UTC m=+853.835161878" watchObservedRunningTime="2026-02-28 04:23:51.84343775 +0000 UTC m=+853.838167962"
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.859003 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"]
Feb 28 04:23:51 crc kubenswrapper[5072]: I0228 04:23:51.860150 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fhpln"]
Feb 28 04:23:52 crc kubenswrapper[5072]: I0228 04:23:52.666891 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" path="/var/lib/kubelet/pods/36e43a8c-291f-4213-9c4f-eb20aa080e56/volumes"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.144843 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537544-msjzd"]
Feb 28 04:24:00 crc kubenswrapper[5072]: E0228 04:24:00.145679 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="registry-server"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.145696 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="registry-server"
Feb 28 04:24:00 crc kubenswrapper[5072]: E0228 04:24:00.145711 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="extract-content"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.145719 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="extract-content"
Feb 28 04:24:00 crc kubenswrapper[5072]: E0228 04:24:00.145731 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="extract-utilities"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.145740 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="extract-utilities"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.145863 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="36e43a8c-291f-4213-9c4f-eb20aa080e56" containerName="registry-server"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.146445 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.149085 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.157876 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.158798 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.161155 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537544-msjzd"]
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.272323 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lsc6\" (UniqueName: \"kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6\") pod \"auto-csr-approver-29537544-msjzd\" (UID: \"e01f23d6-12a8-4823-9d45-763f79cca9a8\") " pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.373611 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lsc6\" (UniqueName: \"kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6\") pod \"auto-csr-approver-29537544-msjzd\" (UID: \"e01f23d6-12a8-4823-9d45-763f79cca9a8\") " pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.396450 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lsc6\" (UniqueName: \"kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6\") pod \"auto-csr-approver-29537544-msjzd\" (UID: \"e01f23d6-12a8-4823-9d45-763f79cca9a8\") " pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.465359 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.631354 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537544-msjzd"]
Feb 28 04:24:00 crc kubenswrapper[5072]: I0228 04:24:00.825287 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537544-msjzd" event={"ID":"e01f23d6-12a8-4823-9d45-763f79cca9a8","Type":"ContainerStarted","Data":"b59b9fc3a35e4af90c4e8120e9b9b6453f6be95774d49247071978d731b15979"}
Feb 28 04:24:02 crc kubenswrapper[5072]: I0228 04:24:02.837136 5072 generic.go:334] "Generic (PLEG): container finished" podID="e01f23d6-12a8-4823-9d45-763f79cca9a8" containerID="307c6a4dfd8b3e4e860aef830a0db6f175c69340657a1e7afe18a4191498f545" exitCode=0
Feb 28 04:24:02 crc kubenswrapper[5072]: I0228 04:24:02.837185 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537544-msjzd" event={"ID":"e01f23d6-12a8-4823-9d45-763f79cca9a8","Type":"ContainerDied","Data":"307c6a4dfd8b3e4e860aef830a0db6f175c69340657a1e7afe18a4191498f545"}
Feb 28 04:24:03 crc kubenswrapper[5072]: I0228 04:24:03.226942 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d"
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.023399 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.129025 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lsc6\" (UniqueName: \"kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6\") pod \"e01f23d6-12a8-4823-9d45-763f79cca9a8\" (UID: \"e01f23d6-12a8-4823-9d45-763f79cca9a8\") "
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.138884 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6" (OuterVolumeSpecName: "kube-api-access-6lsc6") pod "e01f23d6-12a8-4823-9d45-763f79cca9a8" (UID: "e01f23d6-12a8-4823-9d45-763f79cca9a8"). InnerVolumeSpecName "kube-api-access-6lsc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.230914 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lsc6\" (UniqueName: \"kubernetes.io/projected/e01f23d6-12a8-4823-9d45-763f79cca9a8-kube-api-access-6lsc6\") on node \"crc\" DevicePath \"\""
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.856325 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537544-msjzd" event={"ID":"e01f23d6-12a8-4823-9d45-763f79cca9a8","Type":"ContainerDied","Data":"b59b9fc3a35e4af90c4e8120e9b9b6453f6be95774d49247071978d731b15979"}
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.856375 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537544-msjzd"
Feb 28 04:24:04 crc kubenswrapper[5072]: I0228 04:24:04.856381 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59b9fc3a35e4af90c4e8120e9b9b6453f6be95774d49247071978d731b15979"
Feb 28 04:24:05 crc kubenswrapper[5072]: I0228 04:24:05.065239 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537538-6ldpp"]
Feb 28 04:24:05 crc kubenswrapper[5072]: I0228 04:24:05.068822 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537538-6ldpp"]
Feb 28 04:24:06 crc kubenswrapper[5072]: I0228 04:24:06.666273 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9dfb83-81a2-4c25-b386-840173e76451" path="/var/lib/kubelet/pods/6f9dfb83-81a2-4c25-b386-840173e76451/volumes"
Feb 28 04:24:20 crc kubenswrapper[5072]: I0228 04:24:20.236947 5072 scope.go:117] "RemoveContainer" containerID="ebc7329fd199392620135b9178139adfb3e4dd155c2803ce4f8a0ed8ffc19df3"
Feb 28 04:24:22 crc kubenswrapper[5072]: I0228 04:24:22.869169 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85768d6f57-5rpmg"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.646397 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"]
Feb 28 04:24:23 crc kubenswrapper[5072]: E0228 04:24:23.646797 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01f23d6-12a8-4823-9d45-763f79cca9a8" containerName="oc"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.646832 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01f23d6-12a8-4823-9d45-763f79cca9a8" containerName="oc"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.647079 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01f23d6-12a8-4823-9d45-763f79cca9a8" containerName="oc"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.647916 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.653206 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tsdz7"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.653793 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.657108 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"]
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.660278 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l4kss"]
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.662429 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l4kss"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.664463 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.666026 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.729875 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vlqxq"]
Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.730669 5072 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.732432 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.732442 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.734104 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.734738 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l9tcr" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.754266 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-wf6c2"] Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.755072 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.758991 5072 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.763389 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-wf6c2"] Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.782175 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9783d250-c2b9-4e29-a8d7-94d92d301478-cert\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.782233 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metrics-certs\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.782273 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.782295 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-sockets\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: 
I0228 04:24:23.783221 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w68c\" (UniqueName: \"kubernetes.io/projected/9783d250-c2b9-4e29-a8d7-94d92d301478-kube-api-access-6w68c\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783250 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metallb-excludel2\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783288 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96gp7\" (UniqueName: \"kubernetes.io/projected/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-kube-api-access-96gp7\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783304 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783322 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics-certs\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 
04:24:23.783339 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-conf\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783366 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfqvq\" (UniqueName: \"kubernetes.io/projected/fc3e2173-5582-4edb-b330-fb46053b22e2-kube-api-access-kfqvq\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783389 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-startup\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.783410 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-reloader\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884456 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-startup\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884501 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-reloader\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884532 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhf7\" (UniqueName: \"kubernetes.io/projected/81c4d0e9-644c-4f99-af4b-0d73be068ca2-kube-api-access-rdhf7\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884560 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9783d250-c2b9-4e29-a8d7-94d92d301478-cert\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884589 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metrics-certs\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884617 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884632 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-sockets\") pod \"frr-k8s-l4kss\" (UID: 
\"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884663 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884682 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-cert\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884698 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w68c\" (UniqueName: \"kubernetes.io/projected/9783d250-c2b9-4e29-a8d7-94d92d301478-kube-api-access-6w68c\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884713 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metallb-excludel2\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884733 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96gp7\" (UniqueName: \"kubernetes.io/projected/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-kube-api-access-96gp7\") pod \"speaker-vlqxq\" (UID: 
\"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884748 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884763 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics-certs\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884779 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-conf\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.884804 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfqvq\" (UniqueName: \"kubernetes.io/projected/fc3e2173-5582-4edb-b330-fb46053b22e2-kube-api-access-kfqvq\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.885024 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-reloader\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: E0228 04:24:23.885393 5072 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Feb 28 04:24:23 crc kubenswrapper[5072]: E0228 04:24:23.885436 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist podName:7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c nodeName:}" failed. No retries permitted until 2026-02-28 04:24:24.385422016 +0000 UTC m=+886.380152298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist") pod "speaker-vlqxq" (UID: "7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c") : secret "metallb-memberlist" not found Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.885607 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-startup\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.887359 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.887669 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-conf\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.887938 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fc3e2173-5582-4edb-b330-fb46053b22e2-frr-sockets\") pod \"frr-k8s-l4kss\" (UID: 
\"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.891281 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metallb-excludel2\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.901169 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9783d250-c2b9-4e29-a8d7-94d92d301478-cert\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.903744 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc3e2173-5582-4edb-b330-fb46053b22e2-metrics-certs\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.907699 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-metrics-certs\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.917985 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96gp7\" (UniqueName: \"kubernetes.io/projected/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-kube-api-access-96gp7\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.923726 5072 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kfqvq\" (UniqueName: \"kubernetes.io/projected/fc3e2173-5582-4edb-b330-fb46053b22e2-kube-api-access-kfqvq\") pod \"frr-k8s-l4kss\" (UID: \"fc3e2173-5582-4edb-b330-fb46053b22e2\") " pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.925704 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w68c\" (UniqueName: \"kubernetes.io/projected/9783d250-c2b9-4e29-a8d7-94d92d301478-kube-api-access-6w68c\") pod \"frr-k8s-webhook-server-7f989f654f-6mp9v\" (UID: \"9783d250-c2b9-4e29-a8d7-94d92d301478\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.965809 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.977289 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l4kss" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.993245 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.993287 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-cert\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.993349 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhf7\" (UniqueName: \"kubernetes.io/projected/81c4d0e9-644c-4f99-af4b-0d73be068ca2-kube-api-access-rdhf7\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:23 crc kubenswrapper[5072]: E0228 04:24:23.993714 5072 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 28 04:24:23 crc kubenswrapper[5072]: E0228 04:24:23.993751 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs podName:81c4d0e9-644c-4f99-af4b-0d73be068ca2 nodeName:}" failed. No retries permitted until 2026-02-28 04:24:24.493738214 +0000 UTC m=+886.488468406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs") pod "controller-86ddb6bd46-wf6c2" (UID: "81c4d0e9-644c-4f99-af4b-0d73be068ca2") : secret "controller-certs-secret" not found Feb 28 04:24:23 crc kubenswrapper[5072]: I0228 04:24:23.998369 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-cert\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.025555 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhf7\" (UniqueName: \"kubernetes.io/projected/81c4d0e9-644c-4f99-af4b-0d73be068ca2-kube-api-access-rdhf7\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.251135 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"] Feb 28 04:24:24 crc kubenswrapper[5072]: W0228 04:24:24.261673 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9783d250_c2b9_4e29_a8d7_94d92d301478.slice/crio-add7fdd2567f2e869808b635ad46af699a6b638e5140abeb715137529a5e0bd0 WatchSource:0}: Error finding container add7fdd2567f2e869808b635ad46af699a6b638e5140abeb715137529a5e0bd0: Status 404 returned error can't find the container with id add7fdd2567f2e869808b635ad46af699a6b638e5140abeb715137529a5e0bd0 Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.399165 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist\") 
pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq" Feb 28 04:24:24 crc kubenswrapper[5072]: E0228 04:24:24.399356 5072 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 28 04:24:24 crc kubenswrapper[5072]: E0228 04:24:24.399422 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist podName:7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c nodeName:}" failed. No retries permitted until 2026-02-28 04:24:25.399406708 +0000 UTC m=+887.394136900 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist") pod "speaker-vlqxq" (UID: "7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c") : secret "metallb-memberlist" not found Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.499992 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.506039 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81c4d0e9-644c-4f99-af4b-0d73be068ca2-metrics-certs\") pod \"controller-86ddb6bd46-wf6c2\" (UID: \"81c4d0e9-644c-4f99-af4b-0d73be068ca2\") " pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.670016 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-wf6c2" Feb 28 04:24:24 crc kubenswrapper[5072]: I0228 04:24:24.868952 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-wf6c2"] Feb 28 04:24:24 crc kubenswrapper[5072]: W0228 04:24:24.871017 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c4d0e9_644c_4f99_af4b_0d73be068ca2.slice/crio-a98c3f8af4942a185da5026403e37c874e138cc2c40880ae7c5e72e929bbe733 WatchSource:0}: Error finding container a98c3f8af4942a185da5026403e37c874e138cc2c40880ae7c5e72e929bbe733: Status 404 returned error can't find the container with id a98c3f8af4942a185da5026403e37c874e138cc2c40880ae7c5e72e929bbe733 Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.014638 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wf6c2" event={"ID":"81c4d0e9-644c-4f99-af4b-0d73be068ca2","Type":"ContainerStarted","Data":"97a4fa2ff8227b0a18848e9f5ca9cb8a671443aa4aebfba7edf2b25585abd439"} Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.014762 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wf6c2" event={"ID":"81c4d0e9-644c-4f99-af4b-0d73be068ca2","Type":"ContainerStarted","Data":"a98c3f8af4942a185da5026403e37c874e138cc2c40880ae7c5e72e929bbe733"} Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.018140 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" event={"ID":"9783d250-c2b9-4e29-a8d7-94d92d301478","Type":"ContainerStarted","Data":"add7fdd2567f2e869808b635ad46af699a6b638e5140abeb715137529a5e0bd0"} Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.019619 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" 
event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"8091621db81762be2c1d87956f3aecd80eb1db1fcb5310f0e274ca705e273b09"}
Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.409510 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq"
Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.415971 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c-memberlist\") pod \"speaker-vlqxq\" (UID: \"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c\") " pod="metallb-system/speaker-vlqxq"
Feb 28 04:24:25 crc kubenswrapper[5072]: I0228 04:24:25.545867 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vlqxq"
Feb 28 04:24:25 crc kubenswrapper[5072]: W0228 04:24:25.568895 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3ec2ec_a764_4325_8c9a_d2a7a9492c8c.slice/crio-f94c247689ca7199defd3a0dab60925291c90e6db3e83f69456a241757ef2352 WatchSource:0}: Error finding container f94c247689ca7199defd3a0dab60925291c90e6db3e83f69456a241757ef2352: Status 404 returned error can't find the container with id f94c247689ca7199defd3a0dab60925291c90e6db3e83f69456a241757ef2352
Feb 28 04:24:26 crc kubenswrapper[5072]: I0228 04:24:26.027095 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vlqxq" event={"ID":"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c","Type":"ContainerStarted","Data":"e9ea2f1bfae35c3bf5926eb2ca058082d05af31456ccc4cc6f22cc37685cad7d"}
Feb 28 04:24:26 crc kubenswrapper[5072]: I0228 04:24:26.027144 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vlqxq" event={"ID":"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c","Type":"ContainerStarted","Data":"f94c247689ca7199defd3a0dab60925291c90e6db3e83f69456a241757ef2352"}
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.064710 5072 generic.go:334] "Generic (PLEG): container finished" podID="fc3e2173-5582-4edb-b330-fb46053b22e2" containerID="2795b07bb01de33abf93067c6bb5875be3ecda1aefbfa16ee954e683ec255a2b" exitCode=0
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.064767 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerDied","Data":"2795b07bb01de33abf93067c6bb5875be3ecda1aefbfa16ee954e683ec255a2b"}
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.069986 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vlqxq" event={"ID":"7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c","Type":"ContainerStarted","Data":"fcd0b351f4d887fe2b8cfda7c7114e45ffaf1697d7d0d5e2e69ba62ebb9b2d4b"}
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.070132 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vlqxq"
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.072916 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-wf6c2" event={"ID":"81c4d0e9-644c-4f99-af4b-0d73be068ca2","Type":"ContainerStarted","Data":"16359921e1ecde7459830da49c25f46b334f64572248862ac6f1b3ce6604bd85"}
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.073447 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-wf6c2"
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.074931 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" event={"ID":"9783d250-c2b9-4e29-a8d7-94d92d301478","Type":"ContainerStarted","Data":"dff5fc0b74c91bc85083fd94e385e958a826fe0c5af62b6f0de9b50a3a911ce6"}
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.075137 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.114043 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v" podStartSLOduration=1.749595123 podStartE2EDuration="9.114027236s" podCreationTimestamp="2026-02-28 04:24:23 +0000 UTC" firstStartedPulling="2026-02-28 04:24:24.263717976 +0000 UTC m=+886.258448168" lastFinishedPulling="2026-02-28 04:24:31.628150089 +0000 UTC m=+893.622880281" observedRunningTime="2026-02-28 04:24:32.107986548 +0000 UTC m=+894.102716740" watchObservedRunningTime="2026-02-28 04:24:32.114027236 +0000 UTC m=+894.108757428"
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.125501 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-wf6c2" podStartSLOduration=2.5426358000000002 podStartE2EDuration="9.125480724s" podCreationTimestamp="2026-02-28 04:24:23 +0000 UTC" firstStartedPulling="2026-02-28 04:24:25.010558392 +0000 UTC m=+887.005288584" lastFinishedPulling="2026-02-28 04:24:31.593403316 +0000 UTC m=+893.588133508" observedRunningTime="2026-02-28 04:24:32.12121394 +0000 UTC m=+894.115944142" watchObservedRunningTime="2026-02-28 04:24:32.125480724 +0000 UTC m=+894.120210916"
Feb 28 04:24:32 crc kubenswrapper[5072]: I0228 04:24:32.138639 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vlqxq" podStartSLOduration=3.322075523 podStartE2EDuration="9.138620533s" podCreationTimestamp="2026-02-28 04:24:23 +0000 UTC" firstStartedPulling="2026-02-28 04:24:25.777826016 +0000 UTC m=+887.772556208" lastFinishedPulling="2026-02-28 04:24:31.594371026 +0000 UTC m=+893.589101218" observedRunningTime="2026-02-28 04:24:32.133946598 +0000 UTC m=+894.128676800" watchObservedRunningTime="2026-02-28 04:24:32.138620533 +0000 UTC m=+894.133350715"
Feb 28 04:24:33 crc kubenswrapper[5072]: I0228 04:24:33.080096 5072 generic.go:334] "Generic (PLEG): container finished" podID="fc3e2173-5582-4edb-b330-fb46053b22e2" containerID="4260cfc2195cf5517fd4ba5fd90eca403d1e2590500ae2fc0e706526d17b65be" exitCode=0
Feb 28 04:24:33 crc kubenswrapper[5072]: I0228 04:24:33.080197 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerDied","Data":"4260cfc2195cf5517fd4ba5fd90eca403d1e2590500ae2fc0e706526d17b65be"}
Feb 28 04:24:34 crc kubenswrapper[5072]: I0228 04:24:34.089907 5072 generic.go:334] "Generic (PLEG): container finished" podID="fc3e2173-5582-4edb-b330-fb46053b22e2" containerID="7b36860e48cf8202b7c1e4b2cfcd95ae846e60df20adcd5c2dad70c43fce6cee" exitCode=0
Feb 28 04:24:34 crc kubenswrapper[5072]: I0228 04:24:34.090022 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerDied","Data":"7b36860e48cf8202b7c1e4b2cfcd95ae846e60df20adcd5c2dad70c43fce6cee"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.101685 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"f9ea8a3f402ad815965dcbf888f9ef224a3bfdd4a022b31bbaf640d47c0f4746"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.101964 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l4kss"
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.101979 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"8794b15459063914d66a5eac477a55464574bc0566e6e53fc35ef7ce9001c53a"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.101991 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"08d046405d31071bfb044999326970cf5d7e56ba02b95eeb54d409a051b04f33"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.102001 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"da03456e4122199c7c06005b6d9adddbc8bb6cd6fb34dcbfb2c0c35864eab8d6"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.102011 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"da95f31830c8e427fd3d1271bdf15f85c6f1d99966c3d724e660a067accafccd"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.102021 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l4kss" event={"ID":"fc3e2173-5582-4edb-b330-fb46053b22e2","Type":"ContainerStarted","Data":"f49a389545ef25c798f72d96d751a794a4b84ff625948510eb3ff20401cd9768"}
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.133340 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l4kss" podStartSLOduration=4.698699776 podStartE2EDuration="12.133321759s" podCreationTimestamp="2026-02-28 04:24:23 +0000 UTC" firstStartedPulling="2026-02-28 04:24:24.163236002 +0000 UTC m=+886.157966194" lastFinishedPulling="2026-02-28 04:24:31.597857985 +0000 UTC m=+893.592588177" observedRunningTime="2026-02-28 04:24:35.128130917 +0000 UTC m=+897.122861149" watchObservedRunningTime="2026-02-28 04:24:35.133321759 +0000 UTC m=+897.128051961"
Feb 28 04:24:35 crc kubenswrapper[5072]: I0228 04:24:35.557531 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vlqxq"
Feb 28 04:24:38 crc kubenswrapper[5072]: I0228 04:24:38.978507 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l4kss"
Feb 28 04:24:39 crc kubenswrapper[5072]: I0228 04:24:39.014403 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l4kss"
Feb 28 04:24:43 crc kubenswrapper[5072]: I0228 04:24:43.970551 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-6mp9v"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.458800 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.459745 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.468170 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.468270 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-zs2f8"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.468267 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.475861 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.621450 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655g8\" (UniqueName: \"kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8\") pod \"mariadb-operator-index-96g9g\" (UID: \"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5\") " pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.675764 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-wf6c2"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.723276 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-655g8\" (UniqueName: \"kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8\") pod \"mariadb-operator-index-96g9g\" (UID: \"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5\") " pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.747209 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-655g8\" (UniqueName: \"kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8\") pod \"mariadb-operator-index-96g9g\" (UID: \"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5\") " pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:44 crc kubenswrapper[5072]: I0228 04:24:44.780133 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:45 crc kubenswrapper[5072]: I0228 04:24:45.002908 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:45 crc kubenswrapper[5072]: W0228 04:24:45.012811 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff70ba5_9ddf_46ff_91b8_84c9e1834ad5.slice/crio-6d41409919b04420484777c7da6cb930834133af538b82370cb98c329ed09ee4 WatchSource:0}: Error finding container 6d41409919b04420484777c7da6cb930834133af538b82370cb98c329ed09ee4: Status 404 returned error can't find the container with id 6d41409919b04420484777c7da6cb930834133af538b82370cb98c329ed09ee4
Feb 28 04:24:45 crc kubenswrapper[5072]: I0228 04:24:45.155267 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-96g9g" event={"ID":"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5","Type":"ContainerStarted","Data":"6d41409919b04420484777c7da6cb930834133af538b82370cb98c329ed09ee4"}
Feb 28 04:24:47 crc kubenswrapper[5072]: I0228 04:24:47.166830 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-96g9g" event={"ID":"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5","Type":"ContainerStarted","Data":"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"}
Feb 28 04:24:47 crc kubenswrapper[5072]: I0228 04:24:47.179151 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-96g9g" podStartSLOduration=1.409938546 podStartE2EDuration="3.179138844s" podCreationTimestamp="2026-02-28 04:24:44 +0000 UTC" firstStartedPulling="2026-02-28 04:24:45.014292443 +0000 UTC m=+907.009022636" lastFinishedPulling="2026-02-28 04:24:46.783492742 +0000 UTC m=+908.778222934" observedRunningTime="2026-02-28 04:24:47.178109731 +0000 UTC m=+909.172839923" watchObservedRunningTime="2026-02-28 04:24:47.179138844 +0000 UTC m=+909.173869036"
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.453466 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.854528 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.855292 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.862497 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.872283 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6tww\" (UniqueName: \"kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww\") pod \"mariadb-operator-index-vqstk\" (UID: \"aea9870e-2b10-4d66-b478-023e2aed2ced\") " pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.973742 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6tww\" (UniqueName: \"kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww\") pod \"mariadb-operator-index-vqstk\" (UID: \"aea9870e-2b10-4d66-b478-023e2aed2ced\") " pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:48 crc kubenswrapper[5072]: I0228 04:24:48.993220 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6tww\" (UniqueName: \"kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww\") pod \"mariadb-operator-index-vqstk\" (UID: \"aea9870e-2b10-4d66-b478-023e2aed2ced\") " pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.178816 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.178958 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-96g9g" podUID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" containerName="registry-server" containerID="cri-o://eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f" gracePeriod=2
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.536622 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.587309 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-655g8\" (UniqueName: \"kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8\") pod \"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5\" (UID: \"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5\") "
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.593310 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8" (OuterVolumeSpecName: "kube-api-access-655g8") pod "1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" (UID: "1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5"). InnerVolumeSpecName "kube-api-access-655g8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.620427 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:24:49 crc kubenswrapper[5072]: W0228 04:24:49.634216 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea9870e_2b10_4d66_b478_023e2aed2ced.slice/crio-9e3ba8aaa01e28888ed953fd41037459e7a3155edd2c94e229562584b851d4f0 WatchSource:0}: Error finding container 9e3ba8aaa01e28888ed953fd41037459e7a3155edd2c94e229562584b851d4f0: Status 404 returned error can't find the container with id 9e3ba8aaa01e28888ed953fd41037459e7a3155edd2c94e229562584b851d4f0
Feb 28 04:24:49 crc kubenswrapper[5072]: I0228 04:24:49.688727 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-655g8\" (UniqueName: \"kubernetes.io/projected/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5-kube-api-access-655g8\") on node \"crc\" DevicePath \"\""
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.184242 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vqstk" event={"ID":"aea9870e-2b10-4d66-b478-023e2aed2ced","Type":"ContainerStarted","Data":"9e3ba8aaa01e28888ed953fd41037459e7a3155edd2c94e229562584b851d4f0"}
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.185675 5072 generic.go:334] "Generic (PLEG): container finished" podID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" containerID="eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f" exitCode=0
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.185715 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-96g9g" event={"ID":"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5","Type":"ContainerDied","Data":"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"}
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.185743 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-96g9g" event={"ID":"1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5","Type":"ContainerDied","Data":"6d41409919b04420484777c7da6cb930834133af538b82370cb98c329ed09ee4"}
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.185761 5072 scope.go:117] "RemoveContainer" containerID="eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.185793 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-96g9g"
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.202294 5072 scope.go:117] "RemoveContainer" containerID="eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"
Feb 28 04:24:50 crc kubenswrapper[5072]: E0228 04:24:50.202874 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f\": container with ID starting with eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f not found: ID does not exist" containerID="eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.202902 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f"} err="failed to get container status \"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f\": rpc error: code = NotFound desc = could not find container \"eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f\": container with ID starting with eec3f3d7457bb73681832c63bbb5abce738c4d693c02c779191b341a083cab0f not found: ID does not exist"
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.223358 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.226700 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-96g9g"]
Feb 28 04:24:50 crc kubenswrapper[5072]: I0228 04:24:50.665337 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" path="/var/lib/kubelet/pods/1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5/volumes"
Feb 28 04:24:51 crc kubenswrapper[5072]: I0228 04:24:51.192327 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vqstk" event={"ID":"aea9870e-2b10-4d66-b478-023e2aed2ced","Type":"ContainerStarted","Data":"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"}
Feb 28 04:24:51 crc kubenswrapper[5072]: I0228 04:24:51.210061 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-vqstk" podStartSLOduration=2.731455673 podStartE2EDuration="3.210039622s" podCreationTimestamp="2026-02-28 04:24:48 +0000 UTC" firstStartedPulling="2026-02-28 04:24:49.638414047 +0000 UTC m=+911.633144239" lastFinishedPulling="2026-02-28 04:24:50.116997996 +0000 UTC m=+912.111728188" observedRunningTime="2026-02-28 04:24:51.205706407 +0000 UTC m=+913.200436599" watchObservedRunningTime="2026-02-28 04:24:51.210039622 +0000 UTC m=+913.204769814"
Feb 28 04:24:53 crc kubenswrapper[5072]: I0228 04:24:53.980547 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l4kss"
Feb 28 04:24:59 crc kubenswrapper[5072]: I0228 04:24:59.179570 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:59 crc kubenswrapper[5072]: I0228 04:24:59.180602 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:59 crc kubenswrapper[5072]: I0228 04:24:59.214483 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:24:59 crc kubenswrapper[5072]: I0228 04:24:59.276505 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.309576 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"]
Feb 28 04:25:01 crc kubenswrapper[5072]: E0228 04:25:01.309920 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" containerName="registry-server"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.309943 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" containerName="registry-server"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.310349 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff70ba5-9ddf-46ff-91b8-84c9e1834ad5" containerName="registry-server"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.311868 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.313758 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-56tr7"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.317918 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"]
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.449242 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.449509 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbg8h\" (UniqueName: \"kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.449708 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.551900 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.552222 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.552339 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbg8h\" (UniqueName: \"kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.552817 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.552953 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.570972 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbg8h\" (UniqueName: \"kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:01 crc kubenswrapper[5072]: I0228 04:25:01.658297 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:02 crc kubenswrapper[5072]: I0228 04:25:02.063003 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"]
Feb 28 04:25:02 crc kubenswrapper[5072]: I0228 04:25:02.267582 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerStarted","Data":"653b0d24fa4ea520daca0f2d7978213ecb3dc0d0059891612c42ddb47c704811"}
Feb 28 04:25:02 crc kubenswrapper[5072]: I0228 04:25:02.267703 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerStarted","Data":"fda8fc6b3372de02ec67ef16618dce6d68b1dbc3afcf5c3864b7dd130e917cd2"}
Feb 28 04:25:03 crc kubenswrapper[5072]: I0228 04:25:03.275205 5072 generic.go:334] "Generic (PLEG): container finished" podID="5777f06f-eab2-41eb-8b38-f1255369da51" containerID="653b0d24fa4ea520daca0f2d7978213ecb3dc0d0059891612c42ddb47c704811" exitCode=0
Feb 28 04:25:03 crc kubenswrapper[5072]: I0228 04:25:03.275250 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerDied","Data":"653b0d24fa4ea520daca0f2d7978213ecb3dc0d0059891612c42ddb47c704811"}
Feb 28 04:25:03 crc kubenswrapper[5072]: I0228 04:25:03.277422 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 04:25:04 crc kubenswrapper[5072]: I0228 04:25:04.291548 5072 generic.go:334] "Generic (PLEG): container finished" podID="5777f06f-eab2-41eb-8b38-f1255369da51" containerID="b83bd763adc7d84d3c0b3a25ef9c56636923f98d8c37e9276914373942cd8520" exitCode=0
Feb 28 04:25:04 crc kubenswrapper[5072]: I0228 04:25:04.291900 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerDied","Data":"b83bd763adc7d84d3c0b3a25ef9c56636923f98d8c37e9276914373942cd8520"}
Feb 28 04:25:05 crc kubenswrapper[5072]: I0228 04:25:05.318912 5072 generic.go:334] "Generic (PLEG): container finished" podID="5777f06f-eab2-41eb-8b38-f1255369da51" containerID="d5241d62112b26e63145d089dcdd4fcfaf4bbc5a583c5ae7158f1de89c74d6c0" exitCode=0
Feb 28 04:25:05 crc kubenswrapper[5072]: I0228 04:25:05.319069 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerDied","Data":"d5241d62112b26e63145d089dcdd4fcfaf4bbc5a583c5ae7158f1de89c74d6c0"}
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.567031 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.630750 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle\") pod \"5777f06f-eab2-41eb-8b38-f1255369da51\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") "
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.630814 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbg8h\" (UniqueName: \"kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h\") pod \"5777f06f-eab2-41eb-8b38-f1255369da51\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") "
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.630843 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util\") pod \"5777f06f-eab2-41eb-8b38-f1255369da51\" (UID: \"5777f06f-eab2-41eb-8b38-f1255369da51\") "
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.631746 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle" (OuterVolumeSpecName: "bundle") pod "5777f06f-eab2-41eb-8b38-f1255369da51" (UID: "5777f06f-eab2-41eb-8b38-f1255369da51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.638308 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h" (OuterVolumeSpecName: "kube-api-access-mbg8h") pod "5777f06f-eab2-41eb-8b38-f1255369da51" (UID: "5777f06f-eab2-41eb-8b38-f1255369da51"). InnerVolumeSpecName "kube-api-access-mbg8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.646568 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util" (OuterVolumeSpecName: "util") pod "5777f06f-eab2-41eb-8b38-f1255369da51" (UID: "5777f06f-eab2-41eb-8b38-f1255369da51"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.731522 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.731888 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbg8h\" (UniqueName: \"kubernetes.io/projected/5777f06f-eab2-41eb-8b38-f1255369da51-kube-api-access-mbg8h\") on node \"crc\" DevicePath \"\""
Feb 28 04:25:06 crc kubenswrapper[5072]: I0228 04:25:06.731904 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5777f06f-eab2-41eb-8b38-f1255369da51-util\") on node \"crc\" DevicePath \"\""
Feb 28 04:25:07 crc kubenswrapper[5072]: I0228 04:25:07.332288 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx" event={"ID":"5777f06f-eab2-41eb-8b38-f1255369da51","Type":"ContainerDied","Data":"fda8fc6b3372de02ec67ef16618dce6d68b1dbc3afcf5c3864b7dd130e917cd2"}
Feb 28 04:25:07 crc kubenswrapper[5072]: I0228 04:25:07.332325 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda8fc6b3372de02ec67ef16618dce6d68b1dbc3afcf5c3864b7dd130e917cd2"
Feb 28 04:25:07 crc kubenswrapper[5072]: I0228 04:25:07.332333 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.523604 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"]
Feb 28 04:25:14 crc kubenswrapper[5072]: E0228 04:25:14.525266 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="extract"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.525376 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="extract"
Feb 28 04:25:14 crc kubenswrapper[5072]: E0228 04:25:14.525446 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="pull"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.525512 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="pull"
Feb 28 04:25:14 crc kubenswrapper[5072]: E0228 04:25:14.525579 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="util"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.525849 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="util"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.526013 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" containerName="extract"
Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.526458 5072 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.528228 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.528258 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.534220 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-v2hp6" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.543518 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"] Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.733433 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.733538 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzh5\" (UniqueName: \"kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.734548 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.835799 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.835868 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.835890 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzh5\" (UniqueName: \"kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.841065 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: 
\"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.841241 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:14 crc kubenswrapper[5072]: I0228 04:25:14.851355 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzh5\" (UniqueName: \"kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5\") pod \"mariadb-operator-controller-manager-56f56f4fcc-2h9bj\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") " pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:15 crc kubenswrapper[5072]: I0228 04:25:15.144909 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:15 crc kubenswrapper[5072]: I0228 04:25:15.531058 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"] Feb 28 04:25:16 crc kubenswrapper[5072]: I0228 04:25:16.400146 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" event={"ID":"f3f522d3-98e9-446c-985e-01fbeb36f25d","Type":"ContainerStarted","Data":"54d4311087c4b96ff26f0518495676996fc50296f80594220bed2ec2096fed08"} Feb 28 04:25:19 crc kubenswrapper[5072]: I0228 04:25:19.419174 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" event={"ID":"f3f522d3-98e9-446c-985e-01fbeb36f25d","Type":"ContainerStarted","Data":"9911e216c4f0525d9dd024a0158c92fa17cd88d2d765c7bfc3453693f6524216"} Feb 28 04:25:19 crc kubenswrapper[5072]: I0228 04:25:19.419743 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:19 crc kubenswrapper[5072]: I0228 04:25:19.438168 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" podStartSLOduration=2.201659178 podStartE2EDuration="5.438144337s" podCreationTimestamp="2026-02-28 04:25:14 +0000 UTC" firstStartedPulling="2026-02-28 04:25:15.537459779 +0000 UTC m=+937.532189971" lastFinishedPulling="2026-02-28 04:25:18.773944938 +0000 UTC m=+940.768675130" observedRunningTime="2026-02-28 04:25:19.43281329 +0000 UTC m=+941.427543502" watchObservedRunningTime="2026-02-28 04:25:19.438144337 +0000 UTC m=+941.432874529" Feb 28 04:25:25 crc kubenswrapper[5072]: I0228 04:25:25.150230 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.370848 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.372267 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.383352 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.442011 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727s7\" (UniqueName: \"kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.442073 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.442092 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.543416 5072 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-727s7\" (UniqueName: \"kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.543495 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.543517 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.544053 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.544127 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.572396 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727s7\" (UniqueName: 
\"kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7\") pod \"redhat-marketplace-wwvkk\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:27 crc kubenswrapper[5072]: I0228 04:25:27.761016 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:28 crc kubenswrapper[5072]: I0228 04:25:28.098944 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:28 crc kubenswrapper[5072]: I0228 04:25:28.472933 5072 generic.go:334] "Generic (PLEG): container finished" podID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerID="68822d1cd6ddb3343d6ecc0a6c10aa6a095e9918c6b3f7f1e35437a7cfdb3f3f" exitCode=0 Feb 28 04:25:28 crc kubenswrapper[5072]: I0228 04:25:28.473017 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerDied","Data":"68822d1cd6ddb3343d6ecc0a6c10aa6a095e9918c6b3f7f1e35437a7cfdb3f3f"} Feb 28 04:25:28 crc kubenswrapper[5072]: I0228 04:25:28.473061 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerStarted","Data":"f6ee5f8adce6eae850c88336f78e0718c7462bc4421ffe1565fb83d5db20afd7"} Feb 28 04:25:30 crc kubenswrapper[5072]: I0228 04:25:30.486848 5072 generic.go:334] "Generic (PLEG): container finished" podID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerID="71019b2ac0650bb2a39a9640af4bb454163167271045265daf5c32393053ae2a" exitCode=0 Feb 28 04:25:30 crc kubenswrapper[5072]: I0228 04:25:30.486936 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" 
event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerDied","Data":"71019b2ac0650bb2a39a9640af4bb454163167271045265daf5c32393053ae2a"} Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.497179 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerStarted","Data":"208dce6da6f4b744d84caa39f44260fe76917d2506d96d23ed400ccb965b7111"} Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.537224 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wwvkk" podStartSLOduration=2.133940153 podStartE2EDuration="4.537200059s" podCreationTimestamp="2026-02-28 04:25:27 +0000 UTC" firstStartedPulling="2026-02-28 04:25:28.474723359 +0000 UTC m=+950.469453551" lastFinishedPulling="2026-02-28 04:25:30.877983265 +0000 UTC m=+952.872713457" observedRunningTime="2026-02-28 04:25:31.531382218 +0000 UTC m=+953.526112420" watchObservedRunningTime="2026-02-28 04:25:31.537200059 +0000 UTC m=+953.531930241" Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.927421 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"] Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.928202 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.930898 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-rfdf7" Feb 28 04:25:31 crc kubenswrapper[5072]: I0228 04:25:31.940430 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"] Feb 28 04:25:32 crc kubenswrapper[5072]: I0228 04:25:32.009200 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4bj\" (UniqueName: \"kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj\") pod \"infra-operator-index-ss2pk\" (UID: \"cb49b1ff-eed4-41e8-a6e2-5b1514499d41\") " pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:32 crc kubenswrapper[5072]: I0228 04:25:32.111410 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4bj\" (UniqueName: \"kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj\") pod \"infra-operator-index-ss2pk\" (UID: \"cb49b1ff-eed4-41e8-a6e2-5b1514499d41\") " pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:32 crc kubenswrapper[5072]: I0228 04:25:32.147885 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4bj\" (UniqueName: \"kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj\") pod \"infra-operator-index-ss2pk\" (UID: \"cb49b1ff-eed4-41e8-a6e2-5b1514499d41\") " pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:32 crc kubenswrapper[5072]: I0228 04:25:32.250859 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:32 crc kubenswrapper[5072]: I0228 04:25:32.704318 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"] Feb 28 04:25:33 crc kubenswrapper[5072]: I0228 04:25:33.522207 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-ss2pk" event={"ID":"cb49b1ff-eed4-41e8-a6e2-5b1514499d41","Type":"ContainerStarted","Data":"33a482e26fd92d69b5a943ab156012bb1f0b2376f21b2e4da2149883d6e2f451"} Feb 28 04:25:34 crc kubenswrapper[5072]: I0228 04:25:34.534763 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-ss2pk" event={"ID":"cb49b1ff-eed4-41e8-a6e2-5b1514499d41","Type":"ContainerStarted","Data":"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"} Feb 28 04:25:34 crc kubenswrapper[5072]: I0228 04:25:34.570787 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-ss2pk" podStartSLOduration=2.590315062 podStartE2EDuration="3.570757056s" podCreationTimestamp="2026-02-28 04:25:31 +0000 UTC" firstStartedPulling="2026-02-28 04:25:32.72049119 +0000 UTC m=+954.715221382" lastFinishedPulling="2026-02-28 04:25:33.700933184 +0000 UTC m=+955.695663376" observedRunningTime="2026-02-28 04:25:34.55547108 +0000 UTC m=+956.550201292" watchObservedRunningTime="2026-02-28 04:25:34.570757056 +0000 UTC m=+956.565487268" Feb 28 04:25:37 crc kubenswrapper[5072]: I0228 04:25:37.762020 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:37 crc kubenswrapper[5072]: I0228 04:25:37.762364 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:37 crc kubenswrapper[5072]: I0228 04:25:37.817627 5072 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:38 crc kubenswrapper[5072]: I0228 04:25:38.612711 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.129333 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.130666 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.142353 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.251129 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.252026 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.265240 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjpp\" (UniqueName: \"kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.265321 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " 
pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.265359 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.313801 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.366605 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjpp\" (UniqueName: \"kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.367067 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.367856 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.367933 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.368428 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.395542 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjpp\" (UniqueName: \"kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp\") pod \"community-operators-n8kpx\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.458898 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.630904 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-ss2pk" Feb 28 04:25:42 crc kubenswrapper[5072]: I0228 04:25:42.959255 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:43 crc kubenswrapper[5072]: I0228 04:25:43.606549 5072 generic.go:334] "Generic (PLEG): container finished" podID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerID="bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761" exitCode=0 Feb 28 04:25:43 crc kubenswrapper[5072]: I0228 04:25:43.606651 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerDied","Data":"bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761"} Feb 28 04:25:43 crc kubenswrapper[5072]: I0228 04:25:43.608754 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerStarted","Data":"b28a262b69d198f54d4a9c3ae63226fd6ce79fd435436055429c8b4d2f0ac571"} Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.322053 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.322338 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wwvkk" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="registry-server" containerID="cri-o://208dce6da6f4b744d84caa39f44260fe76917d2506d96d23ed400ccb965b7111" gracePeriod=2 Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.626046 5072 generic.go:334] "Generic (PLEG): container 
finished" podID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerID="208dce6da6f4b744d84caa39f44260fe76917d2506d96d23ed400ccb965b7111" exitCode=0 Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.626344 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerDied","Data":"208dce6da6f4b744d84caa39f44260fe76917d2506d96d23ed400ccb965b7111"} Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.629703 5072 generic.go:334] "Generic (PLEG): container finished" podID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerID="d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d" exitCode=0 Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.629734 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerDied","Data":"d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d"} Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.696442 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.821824 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content\") pod \"9be6a573-8f55-4fce-9180-7e4c8638dd41\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.821928 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727s7\" (UniqueName: \"kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7\") pod \"9be6a573-8f55-4fce-9180-7e4c8638dd41\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.821983 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities\") pod \"9be6a573-8f55-4fce-9180-7e4c8638dd41\" (UID: \"9be6a573-8f55-4fce-9180-7e4c8638dd41\") " Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.822593 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities" (OuterVolumeSpecName: "utilities") pod "9be6a573-8f55-4fce-9180-7e4c8638dd41" (UID: "9be6a573-8f55-4fce-9180-7e4c8638dd41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.827326 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7" (OuterVolumeSpecName: "kube-api-access-727s7") pod "9be6a573-8f55-4fce-9180-7e4c8638dd41" (UID: "9be6a573-8f55-4fce-9180-7e4c8638dd41"). InnerVolumeSpecName "kube-api-access-727s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.845855 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9be6a573-8f55-4fce-9180-7e4c8638dd41" (UID: "9be6a573-8f55-4fce-9180-7e4c8638dd41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.923152 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-727s7\" (UniqueName: \"kubernetes.io/projected/9be6a573-8f55-4fce-9180-7e4c8638dd41-kube-api-access-727s7\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.923195 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.923211 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9be6a573-8f55-4fce-9180-7e4c8638dd41-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.963968 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl"] Feb 28 04:25:44 crc kubenswrapper[5072]: E0228 04:25:44.964232 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="extract-utilities" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.964248 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="extract-utilities" Feb 28 04:25:44 crc kubenswrapper[5072]: E0228 04:25:44.964259 5072 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="extract-content" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.964267 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="extract-content" Feb 28 04:25:44 crc kubenswrapper[5072]: E0228 04:25:44.964281 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="registry-server" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.964290 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="registry-server" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.964419 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" containerName="registry-server" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.965339 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.967550 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-56tr7" Feb 28 04:25:44 crc kubenswrapper[5072]: I0228 04:25:44.974468 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl"] Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.126311 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.126408 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4dn\" (UniqueName: \"kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.126471 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 
04:25:45.227296 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.227764 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.227838 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.228066 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.228133 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4dn\" (UniqueName: 
\"kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.245206 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4dn\" (UniqueName: \"kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.279920 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.480694 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl"] Feb 28 04:25:45 crc kubenswrapper[5072]: W0228 04:25:45.485981 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f93606_b8ad_4fbe_b8dc_b7c9722e450c.slice/crio-18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96 WatchSource:0}: Error finding container 18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96: Status 404 returned error can't find the container with id 18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96 Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.638298 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" 
event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerStarted","Data":"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f"} Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.641046 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwvkk" event={"ID":"9be6a573-8f55-4fce-9180-7e4c8638dd41","Type":"ContainerDied","Data":"f6ee5f8adce6eae850c88336f78e0718c7462bc4421ffe1565fb83d5db20afd7"} Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.641078 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwvkk" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.641114 5072 scope.go:117] "RemoveContainer" containerID="208dce6da6f4b744d84caa39f44260fe76917d2506d96d23ed400ccb965b7111" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.642017 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerStarted","Data":"18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96"} Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.658609 5072 scope.go:117] "RemoveContainer" containerID="71019b2ac0650bb2a39a9640af4bb454163167271045265daf5c32393053ae2a" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.682538 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n8kpx" podStartSLOduration=2.222408156 podStartE2EDuration="3.682522973s" podCreationTimestamp="2026-02-28 04:25:42 +0000 UTC" firstStartedPulling="2026-02-28 04:25:43.608072413 +0000 UTC m=+965.602802605" lastFinishedPulling="2026-02-28 04:25:45.06818723 +0000 UTC m=+967.062917422" observedRunningTime="2026-02-28 04:25:45.681026416 +0000 UTC m=+967.675756608" watchObservedRunningTime="2026-02-28 04:25:45.682522973 +0000 UTC 
m=+967.677253165" Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.704693 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.712970 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwvkk"] Feb 28 04:25:45 crc kubenswrapper[5072]: I0228 04:25:45.714252 5072 scope.go:117] "RemoveContainer" containerID="68822d1cd6ddb3343d6ecc0a6c10aa6a095e9918c6b3f7f1e35437a7cfdb3f3f" Feb 28 04:25:46 crc kubenswrapper[5072]: I0228 04:25:46.650957 5072 generic.go:334] "Generic (PLEG): container finished" podID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerID="dfe9c897421e35a94eeef77a6f2ff37f92cf75e56fd8cb883528baa35adef364" exitCode=0 Feb 28 04:25:46 crc kubenswrapper[5072]: I0228 04:25:46.651063 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerDied","Data":"dfe9c897421e35a94eeef77a6f2ff37f92cf75e56fd8cb883528baa35adef364"} Feb 28 04:25:46 crc kubenswrapper[5072]: I0228 04:25:46.670266 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be6a573-8f55-4fce-9180-7e4c8638dd41" path="/var/lib/kubelet/pods/9be6a573-8f55-4fce-9180-7e4c8638dd41/volumes" Feb 28 04:25:47 crc kubenswrapper[5072]: I0228 04:25:47.667119 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerStarted","Data":"809343c9ee42f6592affc2b8749731fd47732d8e9729c316750331a32d6adba9"} Feb 28 04:25:48 crc kubenswrapper[5072]: I0228 04:25:48.678365 5072 generic.go:334] "Generic (PLEG): container finished" podID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" 
containerID="809343c9ee42f6592affc2b8749731fd47732d8e9729c316750331a32d6adba9" exitCode=0 Feb 28 04:25:48 crc kubenswrapper[5072]: I0228 04:25:48.678456 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerDied","Data":"809343c9ee42f6592affc2b8749731fd47732d8e9729c316750331a32d6adba9"} Feb 28 04:25:49 crc kubenswrapper[5072]: I0228 04:25:49.692529 5072 generic.go:334] "Generic (PLEG): container finished" podID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerID="35888c4b552ad7c4d2ba2dea0a3ad2136b741607e5dfc7779234fafc29e8a90a" exitCode=0 Feb 28 04:25:49 crc kubenswrapper[5072]: I0228 04:25:49.692933 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerDied","Data":"35888c4b552ad7c4d2ba2dea0a3ad2136b741607e5dfc7779234fafc29e8a90a"} Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.014756 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.122803 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t4dn\" (UniqueName: \"kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn\") pod \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.122865 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle\") pod \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.122934 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util\") pod \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\" (UID: \"05f93606-b8ad-4fbe-b8dc-b7c9722e450c\") " Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.126422 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle" (OuterVolumeSpecName: "bundle") pod "05f93606-b8ad-4fbe-b8dc-b7c9722e450c" (UID: "05f93606-b8ad-4fbe-b8dc-b7c9722e450c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.130285 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn" (OuterVolumeSpecName: "kube-api-access-8t4dn") pod "05f93606-b8ad-4fbe-b8dc-b7c9722e450c" (UID: "05f93606-b8ad-4fbe-b8dc-b7c9722e450c"). InnerVolumeSpecName "kube-api-access-8t4dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.225112 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t4dn\" (UniqueName: \"kubernetes.io/projected/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-kube-api-access-8t4dn\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.225348 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.484546 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util" (OuterVolumeSpecName: "util") pod "05f93606-b8ad-4fbe-b8dc-b7c9722e450c" (UID: "05f93606-b8ad-4fbe-b8dc-b7c9722e450c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.530217 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05f93606-b8ad-4fbe-b8dc-b7c9722e450c-util\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.728507 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" event={"ID":"05f93606-b8ad-4fbe-b8dc-b7c9722e450c","Type":"ContainerDied","Data":"18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96"} Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.728615 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d1ad364fb0f92d83193b2494cc347cd8cc15f2a03abc7ae66031d8554f5a96" Feb 28 04:25:51 crc kubenswrapper[5072]: I0228 04:25:51.728720 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl" Feb 28 04:25:52 crc kubenswrapper[5072]: I0228 04:25:52.459856 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:52 crc kubenswrapper[5072]: I0228 04:25:52.459902 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:52 crc kubenswrapper[5072]: I0228 04:25:52.519503 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:52 crc kubenswrapper[5072]: I0228 04:25:52.788568 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.727025 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:25:56 crc kubenswrapper[5072]: E0228 04:25:56.727265 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="extract" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.727276 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="extract" Feb 28 04:25:56 crc kubenswrapper[5072]: E0228 04:25:56.727286 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="pull" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.727291 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="pull" Feb 28 04:25:56 crc kubenswrapper[5072]: E0228 04:25:56.727298 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="util" Feb 
28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.727305 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="util" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.727410 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" containerName="extract" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.728294 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.756250 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.812734 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsm4n\" (UniqueName: \"kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.812796 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.812844 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 
04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.913988 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.914073 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.914179 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsm4n\" (UniqueName: \"kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.914688 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 04:25:56.914781 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:56 crc kubenswrapper[5072]: I0228 
04:25:56.934046 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsm4n\" (UniqueName: \"kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n\") pod \"certified-operators-x4kgr\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.044999 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.293786 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:25:57 crc kubenswrapper[5072]: W0228 04:25:57.305249 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31fa42ba_a85d_47c3_a663_dd23c6029522.slice/crio-86369fbc5af1bf7066cebfee2fe760ceb01e3338c1b7141f8218d4675b5903d3 WatchSource:0}: Error finding container 86369fbc5af1bf7066cebfee2fe760ceb01e3338c1b7141f8218d4675b5903d3: Status 404 returned error can't find the container with id 86369fbc5af1bf7066cebfee2fe760ceb01e3338c1b7141f8218d4675b5903d3 Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.323680 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.324053 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n8kpx" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="registry-server" containerID="cri-o://0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f" gracePeriod=2 Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.725211 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.776483 5072 generic.go:334] "Generic (PLEG): container finished" podID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerID="0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f" exitCode=0 Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.776543 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerDied","Data":"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f"} Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.776546 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n8kpx" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.776569 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8kpx" event={"ID":"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4","Type":"ContainerDied","Data":"b28a262b69d198f54d4a9c3ae63226fd6ce79fd435436055429c8b4d2f0ac571"} Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.776583 5072 scope.go:117] "RemoveContainer" containerID="0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.778452 5072 generic.go:334] "Generic (PLEG): container finished" podID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerID="c0648f8fc1c5eb5d40aaf8f199539ff1b6d0f28492742be08d5225e88dca420e" exitCode=0 Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.778482 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerDied","Data":"c0648f8fc1c5eb5d40aaf8f199539ff1b6d0f28492742be08d5225e88dca420e"} Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.778501 5072 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerStarted","Data":"86369fbc5af1bf7066cebfee2fe760ceb01e3338c1b7141f8218d4675b5903d3"} Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.800851 5072 scope.go:117] "RemoveContainer" containerID="d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.820730 5072 scope.go:117] "RemoveContainer" containerID="bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.825705 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content\") pod \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.825758 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbjpp\" (UniqueName: \"kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp\") pod \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.825804 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities\") pod \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\" (UID: \"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4\") " Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.828100 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities" (OuterVolumeSpecName: "utilities") pod "c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" (UID: 
"c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.833736 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp" (OuterVolumeSpecName: "kube-api-access-zbjpp") pod "c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" (UID: "c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4"). InnerVolumeSpecName "kube-api-access-zbjpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.836786 5072 scope.go:117] "RemoveContainer" containerID="0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f" Feb 28 04:25:57 crc kubenswrapper[5072]: E0228 04:25:57.837122 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f\": container with ID starting with 0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f not found: ID does not exist" containerID="0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.837151 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f"} err="failed to get container status \"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f\": rpc error: code = NotFound desc = could not find container \"0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f\": container with ID starting with 0b1362ab9447c22d873cc8dd377e07c2a1637cd5c5d7bcad469951e78fc26e9f not found: ID does not exist" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.837170 5072 scope.go:117] "RemoveContainer" 
containerID="d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d" Feb 28 04:25:57 crc kubenswrapper[5072]: E0228 04:25:57.837377 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d\": container with ID starting with d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d not found: ID does not exist" containerID="d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.837396 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d"} err="failed to get container status \"d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d\": rpc error: code = NotFound desc = could not find container \"d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d\": container with ID starting with d7450fd65feaa871a08e96accd5c3de56e564145d43f3e43ba00939da44d693d not found: ID does not exist" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.837410 5072 scope.go:117] "RemoveContainer" containerID="bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761" Feb 28 04:25:57 crc kubenswrapper[5072]: E0228 04:25:57.837564 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761\": container with ID starting with bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761 not found: ID does not exist" containerID="bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.837583 5072 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761"} err="failed to get container status \"bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761\": rpc error: code = NotFound desc = could not find container \"bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761\": container with ID starting with bc9818afffe575546822dce6498e6f138b5534b9ea5ad5fefd442d759e020761 not found: ID does not exist" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.877560 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" (UID: "c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.927539 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbjpp\" (UniqueName: \"kubernetes.io/projected/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-kube-api-access-zbjpp\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.927577 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:57 crc kubenswrapper[5072]: I0228 04:25:57.927599 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:25:58 crc kubenswrapper[5072]: I0228 04:25:58.103996 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:58 crc kubenswrapper[5072]: I0228 04:25:58.110137 5072 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-n8kpx"] Feb 28 04:25:58 crc kubenswrapper[5072]: I0228 04:25:58.665877 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" path="/var/lib/kubelet/pods/c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4/volumes" Feb 28 04:25:58 crc kubenswrapper[5072]: I0228 04:25:58.786844 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerStarted","Data":"b1288b9bfaaa1de01cb5ff2b8b37a4d37e97ab1d585a6323072e00116085e7d4"} Feb 28 04:25:59 crc kubenswrapper[5072]: I0228 04:25:59.793729 5072 generic.go:334] "Generic (PLEG): container finished" podID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerID="b1288b9bfaaa1de01cb5ff2b8b37a4d37e97ab1d585a6323072e00116085e7d4" exitCode=0 Feb 28 04:25:59 crc kubenswrapper[5072]: I0228 04:25:59.794048 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerDied","Data":"b1288b9bfaaa1de01cb5ff2b8b37a4d37e97ab1d585a6323072e00116085e7d4"} Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.121775 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537546-ztbp6"] Feb 28 04:26:00 crc kubenswrapper[5072]: E0228 04:26:00.122003 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="registry-server" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.122015 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="registry-server" Feb 28 04:26:00 crc kubenswrapper[5072]: E0228 04:26:00.122031 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="extract-utilities" Feb 
28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.122037 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="extract-utilities" Feb 28 04:26:00 crc kubenswrapper[5072]: E0228 04:26:00.122051 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="extract-content" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.122060 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="extract-content" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.122148 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17f73ef-d92d-4ef3-b889-6eeaee4a5ea4" containerName="registry-server" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.122588 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.124823 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.125090 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.125222 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.147662 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537546-ztbp6"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.160258 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qh9\" (UniqueName: \"kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9\") pod 
\"auto-csr-approver-29537546-ztbp6\" (UID: \"1bb6cfd5-00f1-4028-b63e-96effbd865f0\") " pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.261535 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qh9\" (UniqueName: \"kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9\") pod \"auto-csr-approver-29537546-ztbp6\" (UID: \"1bb6cfd5-00f1-4028-b63e-96effbd865f0\") " pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.292920 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qh9\" (UniqueName: \"kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9\") pod \"auto-csr-approver-29537546-ztbp6\" (UID: \"1bb6cfd5-00f1-4028-b63e-96effbd865f0\") " pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.439568 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.531409 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.532357 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.536437 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"galera-openstack-dockercfg-98rq2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.536501 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-config-data" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.536665 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openshift-service-ca.crt" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.536771 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"kube-root-ca.crt" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.537990 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-scripts" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.548030 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.556261 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.557360 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565163 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565223 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szqmd\" (UniqueName: \"kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565252 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565274 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565302 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.565342 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.567848 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.568751 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.594480 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.599746 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666724 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666773 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc 
kubenswrapper[5072]: I0228 04:26:00.666803 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666828 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666850 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666881 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666897 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: 
I0228 04:26:00.666916 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666934 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666952 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phj5n\" (UniqueName: \"kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666968 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68nsk\" (UniqueName: \"kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.666991 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 
04:26:00.667006 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.667031 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.667054 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.667076 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.667104 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szqmd\" (UniqueName: \"kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 
04:26:00.667126 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.667518 5072 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") device mount path \"/mnt/openstack/pv01\"" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.668547 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.668724 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.668977 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.669530 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.685054 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szqmd\" (UniqueName: \"kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.686274 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768731 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768806 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768867 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts\") pod \"openstack-galera-1\" (UID: 
\"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768887 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768933 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.768932 5072 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") device mount path \"/mnt/openstack/pv11\"" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.769957 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771020 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " 
pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771224 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771274 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771384 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771424 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771469 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771503 
5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phj5n\" (UniqueName: \"kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771537 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68nsk\" (UniqueName: \"kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.771603 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.772010 5072 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") device mount path \"/mnt/openstack/pv07\"" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.772242 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.772432 5072 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.772845 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.773146 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.773242 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.786976 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.787610 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-2\" (UID: 
\"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.788174 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68nsk\" (UniqueName: \"kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk\") pod \"openstack-galera-2\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.788753 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phj5n\" (UniqueName: \"kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n\") pod \"openstack-galera-1\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") " pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.802857 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerStarted","Data":"1b72b09570a354b98adb7d95e34485ae0b70c0f1df23d2dba66c856a5f7a0fd6"} Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.821254 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x4kgr" podStartSLOduration=2.399261737 podStartE2EDuration="4.821234494s" podCreationTimestamp="2026-02-28 04:25:56 +0000 UTC" firstStartedPulling="2026-02-28 04:25:57.77954459 +0000 UTC m=+979.774274772" lastFinishedPulling="2026-02-28 04:26:00.201517337 +0000 UTC m=+982.196247529" observedRunningTime="2026-02-28 04:26:00.818336203 +0000 UTC m=+982.813066405" watchObservedRunningTime="2026-02-28 04:26:00.821234494 +0000 UTC m=+982.815964686" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.854051 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.854892 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537546-ztbp6"] Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.886711 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:00 crc kubenswrapper[5072]: I0228 04:26:00.893859 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.117574 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:26:01 crc kubenswrapper[5072]: W0228 04:26:01.123528 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode56491ab_6d17_4127_a25b_75b5e900e0aa.slice/crio-940dd5a230baf5ea65ef57598a5bfd55ce14a6b0e23580326c809965c27f2c2d WatchSource:0}: Error finding container 940dd5a230baf5ea65ef57598a5bfd55ce14a6b0e23580326c809965c27f2c2d: Status 404 returned error can't find the container with id 940dd5a230baf5ea65ef57598a5bfd55ce14a6b0e23580326c809965c27f2c2d Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.360801 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Feb 28 04:26:01 crc kubenswrapper[5072]: W0228 04:26:01.360896 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8171cc83_a178_4d19_b1c5_0d93b123838c.slice/crio-27007c39a2131d9b8e6c4702a297dd6d1e6d6c5632bd74698ac8f31dbc6f51e1 WatchSource:0}: Error finding container 27007c39a2131d9b8e6c4702a297dd6d1e6d6c5632bd74698ac8f31dbc6f51e1: Status 404 returned error can't find the container with id 
27007c39a2131d9b8e6c4702a297dd6d1e6d6c5632bd74698ac8f31dbc6f51e1 Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.372298 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.816833 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerStarted","Data":"ce607371b58b0d39e69ca7abef7ff061fd840b7592e4e67caf4f4b8c094d4730"} Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.817430 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerStarted","Data":"27007c39a2131d9b8e6c4702a297dd6d1e6d6c5632bd74698ac8f31dbc6f51e1"} Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.820971 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" event={"ID":"1bb6cfd5-00f1-4028-b63e-96effbd865f0","Type":"ContainerStarted","Data":"945b7956015969f901fa80b2c50fd22d04df114a0f82f2593114e4c9cc393b11"} Feb 28 04:26:01 crc kubenswrapper[5072]: I0228 04:26:01.824512 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerStarted","Data":"940dd5a230baf5ea65ef57598a5bfd55ce14a6b0e23580326c809965c27f2c2d"} Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.262872 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"] Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.263695 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.267589 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n5knq" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.267588 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.285059 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"] Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.301077 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw6tm\" (UniqueName: \"kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.301211 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.301280 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: 
\"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.403347 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.403434 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.403543 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw6tm\" (UniqueName: \"kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.411497 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.426585 5072 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kw6tm\" (UniqueName: \"kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.431580 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert\") pod \"infra-operator-controller-manager-8c869c9f9-dnsxj\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") " pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.598297 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.833413 5072 generic.go:334] "Generic (PLEG): container finished" podID="1bb6cfd5-00f1-4028-b63e-96effbd865f0" containerID="e31182fb92a2f9129ff8de4bee9c0da29c7b6970056ab9089b995d03577aa6b3" exitCode=0 Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.833589 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" event={"ID":"1bb6cfd5-00f1-4028-b63e-96effbd865f0","Type":"ContainerDied","Data":"e31182fb92a2f9129ff8de4bee9c0da29c7b6970056ab9089b995d03577aa6b3"} Feb 28 04:26:02 crc kubenswrapper[5072]: I0228 04:26:02.889235 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"] Feb 28 04:26:02 crc kubenswrapper[5072]: W0228 04:26:02.902885 5072 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fe0761_ad11_4ccf_9511_2c074bed0915.slice/crio-2b4eb3fb04285d1bf97e1c24b4543d0ed3c9548abf47cc22df098795bf07120d WatchSource:0}: Error finding container 2b4eb3fb04285d1bf97e1c24b4543d0ed3c9548abf47cc22df098795bf07120d: Status 404 returned error can't find the container with id 2b4eb3fb04285d1bf97e1c24b4543d0ed3c9548abf47cc22df098795bf07120d Feb 28 04:26:03 crc kubenswrapper[5072]: I0228 04:26:03.843686 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" event={"ID":"23fe0761-ad11-4ccf-9511-2c074bed0915","Type":"ContainerStarted","Data":"2b4eb3fb04285d1bf97e1c24b4543d0ed3c9548abf47cc22df098795bf07120d"} Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.168614 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.249041 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4qh9\" (UniqueName: \"kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9\") pod \"1bb6cfd5-00f1-4028-b63e-96effbd865f0\" (UID: \"1bb6cfd5-00f1-4028-b63e-96effbd865f0\") " Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.257827 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9" (OuterVolumeSpecName: "kube-api-access-j4qh9") pod "1bb6cfd5-00f1-4028-b63e-96effbd865f0" (UID: "1bb6cfd5-00f1-4028-b63e-96effbd865f0"). InnerVolumeSpecName "kube-api-access-j4qh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.350544 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4qh9\" (UniqueName: \"kubernetes.io/projected/1bb6cfd5-00f1-4028-b63e-96effbd865f0-kube-api-access-j4qh9\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.859567 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.859486 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537546-ztbp6" event={"ID":"1bb6cfd5-00f1-4028-b63e-96effbd865f0","Type":"ContainerDied","Data":"945b7956015969f901fa80b2c50fd22d04df114a0f82f2593114e4c9cc393b11"} Feb 28 04:26:04 crc kubenswrapper[5072]: I0228 04:26:04.862169 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945b7956015969f901fa80b2c50fd22d04df114a0f82f2593114e4c9cc393b11" Feb 28 04:26:05 crc kubenswrapper[5072]: I0228 04:26:05.219911 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537540-rcqfw"] Feb 28 04:26:05 crc kubenswrapper[5072]: I0228 04:26:05.225404 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537540-rcqfw"] Feb 28 04:26:06 crc kubenswrapper[5072]: I0228 04:26:06.674421 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7aa2b36-997c-4e7e-b869-a116e9e9fd74" path="/var/lib/kubelet/pods/e7aa2b36-997c-4e7e-b869-a116e9e9fd74/volumes" Feb 28 04:26:07 crc kubenswrapper[5072]: I0228 04:26:07.046161 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:07 crc kubenswrapper[5072]: I0228 04:26:07.046209 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:07 crc kubenswrapper[5072]: I0228 04:26:07.094747 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:07 crc kubenswrapper[5072]: I0228 04:26:07.926552 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:11 crc kubenswrapper[5072]: I0228 04:26:11.120129 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:26:11 crc kubenswrapper[5072]: I0228 04:26:11.120693 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x4kgr" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="registry-server" containerID="cri-o://1b72b09570a354b98adb7d95e34485ae0b70c0f1df23d2dba66c856a5f7a0fd6" gracePeriod=2 Feb 28 04:26:11 crc kubenswrapper[5072]: I0228 04:26:11.905458 5072 generic.go:334] "Generic (PLEG): container finished" podID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerID="1b72b09570a354b98adb7d95e34485ae0b70c0f1df23d2dba66c856a5f7a0fd6" exitCode=0 Feb 28 04:26:11 crc kubenswrapper[5072]: I0228 04:26:11.905499 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerDied","Data":"1b72b09570a354b98adb7d95e34485ae0b70c0f1df23d2dba66c856a5f7a0fd6"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.494195 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.630964 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities\") pod \"31fa42ba-a85d-47c3-a663-dd23c6029522\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.631604 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content\") pod \"31fa42ba-a85d-47c3-a663-dd23c6029522\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.631814 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsm4n\" (UniqueName: \"kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n\") pod \"31fa42ba-a85d-47c3-a663-dd23c6029522\" (UID: \"31fa42ba-a85d-47c3-a663-dd23c6029522\") " Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.632241 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities" (OuterVolumeSpecName: "utilities") pod "31fa42ba-a85d-47c3-a663-dd23c6029522" (UID: "31fa42ba-a85d-47c3-a663-dd23c6029522"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.638784 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n" (OuterVolumeSpecName: "kube-api-access-fsm4n") pod "31fa42ba-a85d-47c3-a663-dd23c6029522" (UID: "31fa42ba-a85d-47c3-a663-dd23c6029522"). InnerVolumeSpecName "kube-api-access-fsm4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.696850 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa42ba-a85d-47c3-a663-dd23c6029522" (UID: "31fa42ba-a85d-47c3-a663-dd23c6029522"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.734737 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.734845 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa42ba-a85d-47c3-a663-dd23c6029522-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.734861 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsm4n\" (UniqueName: \"kubernetes.io/projected/31fa42ba-a85d-47c3-a663-dd23c6029522-kube-api-access-fsm4n\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.919116 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4kgr" event={"ID":"31fa42ba-a85d-47c3-a663-dd23c6029522","Type":"ContainerDied","Data":"86369fbc5af1bf7066cebfee2fe760ceb01e3338c1b7141f8218d4675b5903d3"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.919211 5072 scope.go:117] "RemoveContainer" containerID="1b72b09570a354b98adb7d95e34485ae0b70c0f1df23d2dba66c856a5f7a0fd6" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.919417 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4kgr" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.926332 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerStarted","Data":"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.929269 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" event={"ID":"23fe0761-ad11-4ccf-9511-2c074bed0915","Type":"ContainerStarted","Data":"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.929441 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.932171 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerStarted","Data":"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.946253 5072 scope.go:117] "RemoveContainer" containerID="b1288b9bfaaa1de01cb5ff2b8b37a4d37e97ab1d585a6323072e00116085e7d4" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.947095 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerStarted","Data":"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f"} Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.969984 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" 
podStartSLOduration=1.558031449 podStartE2EDuration="10.969960995s" podCreationTimestamp="2026-02-28 04:26:02 +0000 UTC" firstStartedPulling="2026-02-28 04:26:02.9059363 +0000 UTC m=+984.900666492" lastFinishedPulling="2026-02-28 04:26:12.317865846 +0000 UTC m=+994.312596038" observedRunningTime="2026-02-28 04:26:12.956420302 +0000 UTC m=+994.951150524" watchObservedRunningTime="2026-02-28 04:26:12.969960995 +0000 UTC m=+994.964691187" Feb 28 04:26:12 crc kubenswrapper[5072]: I0228 04:26:12.987038 5072 scope.go:117] "RemoveContainer" containerID="c0648f8fc1c5eb5d40aaf8f199539ff1b6d0f28492742be08d5225e88dca420e" Feb 28 04:26:13 crc kubenswrapper[5072]: I0228 04:26:13.048917 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:26:13 crc kubenswrapper[5072]: I0228 04:26:13.056665 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x4kgr"] Feb 28 04:26:14 crc kubenswrapper[5072]: I0228 04:26:14.666918 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" path="/var/lib/kubelet/pods/31fa42ba-a85d-47c3-a663-dd23c6029522/volumes" Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.983283 5072 generic.go:334] "Generic (PLEG): container finished" podID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerID="6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a" exitCode=0 Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.983617 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerDied","Data":"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a"} Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.988318 5072 generic.go:334] "Generic (PLEG): container finished" podID="9c25f535-2cfb-40b6-9412-9888a0fc1975" 
containerID="e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f" exitCode=0 Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.988444 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerDied","Data":"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f"} Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.992920 5072 generic.go:334] "Generic (PLEG): container finished" podID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerID="13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb" exitCode=0 Feb 28 04:26:16 crc kubenswrapper[5072]: I0228 04:26:16.992989 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerDied","Data":"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"} Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.001489 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerStarted","Data":"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"} Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.004181 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerStarted","Data":"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7"} Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.007625 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerStarted","Data":"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e"} Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.039044 5072 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-1" podStartSLOduration=8.083019622 podStartE2EDuration="19.039022482s" podCreationTimestamp="2026-02-28 04:25:59 +0000 UTC" firstStartedPulling="2026-02-28 04:26:01.365253552 +0000 UTC m=+983.359983744" lastFinishedPulling="2026-02-28 04:26:12.321256412 +0000 UTC m=+994.315986604" observedRunningTime="2026-02-28 04:26:18.038141374 +0000 UTC m=+1000.032871566" watchObservedRunningTime="2026-02-28 04:26:18.039022482 +0000 UTC m=+1000.033752664" Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.075288 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-2" podStartSLOduration=8.075351823 podStartE2EDuration="19.075264676s" podCreationTimestamp="2026-02-28 04:25:59 +0000 UTC" firstStartedPulling="2026-02-28 04:26:01.380761708 +0000 UTC m=+983.375491890" lastFinishedPulling="2026-02-28 04:26:12.380674551 +0000 UTC m=+994.375404743" observedRunningTime="2026-02-28 04:26:18.071547369 +0000 UTC m=+1000.066277571" watchObservedRunningTime="2026-02-28 04:26:18.075264676 +0000 UTC m=+1000.069994878" Feb 28 04:26:18 crc kubenswrapper[5072]: I0228 04:26:18.113704 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-0" podStartSLOduration=7.898911882 podStartE2EDuration="19.113680117s" podCreationTimestamp="2026-02-28 04:25:59 +0000 UTC" firstStartedPulling="2026-02-28 04:26:01.127173724 +0000 UTC m=+983.121903916" lastFinishedPulling="2026-02-28 04:26:12.341941959 +0000 UTC m=+994.336672151" observedRunningTime="2026-02-28 04:26:18.112382056 +0000 UTC m=+1000.107112248" watchObservedRunningTime="2026-02-28 04:26:18.113680117 +0000 UTC m=+1000.108410309" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.106523 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.106948 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.340577 5072 scope.go:117] "RemoveContainer" containerID="8ff26ba1f6c190544340eaba8e2adc86958af681c78d833fb1ede91de0c7b797" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.855113 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.855455 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.886960 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.887011 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.894495 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:20 crc kubenswrapper[5072]: I0228 04:26:20.894532 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:22 crc kubenswrapper[5072]: I0228 04:26:22.602724 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.635377 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:26:25 crc kubenswrapper[5072]: E0228 04:26:25.635985 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="extract-content" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636003 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="extract-content" Feb 28 04:26:25 crc kubenswrapper[5072]: E0228 04:26:25.636018 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb6cfd5-00f1-4028-b63e-96effbd865f0" containerName="oc" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636026 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb6cfd5-00f1-4028-b63e-96effbd865f0" containerName="oc" Feb 28 04:26:25 crc kubenswrapper[5072]: E0228 04:26:25.636034 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="registry-server" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636040 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="registry-server" Feb 28 04:26:25 crc kubenswrapper[5072]: E0228 04:26:25.636047 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="extract-utilities" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636056 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="extract-utilities" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636197 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="31fa42ba-a85d-47c3-a663-dd23c6029522" containerName="registry-server" 
Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636212 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb6cfd5-00f1-4028-b63e-96effbd865f0" containerName="oc" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.636755 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.639732 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"memcached-memcached-dockercfg-8cd5x" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.641988 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.642803 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"memcached-config-data" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.741335 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skxzv\" (UniqueName: \"kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.741406 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.741708 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data\") pod \"memcached-0\" (UID: 
\"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.842575 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.842658 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skxzv\" (UniqueName: \"kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.842693 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.843667 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.843682 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.864814 5072 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-skxzv\" (UniqueName: \"kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv\") pod \"memcached-0\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:25 crc kubenswrapper[5072]: I0228 04:26:25.951185 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:26 crc kubenswrapper[5072]: I0228 04:26:26.349150 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:26:26 crc kubenswrapper[5072]: I0228 04:26:26.985376 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:27 crc kubenswrapper[5072]: I0228 04:26:27.075805 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"2da690cd-386a-45cf-89a9-4d5a02218af4","Type":"ContainerStarted","Data":"61edea0b66d9fcfa1633804127dd188965b4668825cae119e984bf6fb14b00a3"} Feb 28 04:26:27 crc kubenswrapper[5072]: I0228 04:26:27.079091 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:26:27 crc kubenswrapper[5072]: E0228 04:26:27.548956 5072 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.5:33934->38.102.83.5:41001: write tcp 38.102.83.5:33934->38.102.83.5:41001: write: broken pipe Feb 28 04:26:28 crc kubenswrapper[5072]: I0228 04:26:28.939401 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"] Feb 28 04:26:28 crc kubenswrapper[5072]: I0228 04:26:28.940527 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:28 crc kubenswrapper[5072]: I0228 04:26:28.942938 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-pqt8c" Feb 28 04:26:28 crc kubenswrapper[5072]: I0228 04:26:28.948218 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"] Feb 28 04:26:28 crc kubenswrapper[5072]: I0228 04:26:28.991266 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7bcf\" (UniqueName: \"kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf\") pod \"rabbitmq-cluster-operator-index-9g9sv\" (UID: \"c35bd0a7-4cba-4185-a45c-bfaf82c04638\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.089685 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"2da690cd-386a-45cf-89a9-4d5a02218af4","Type":"ContainerStarted","Data":"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5"} Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.089765 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.092474 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7bcf\" (UniqueName: \"kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf\") pod \"rabbitmq-cluster-operator-index-9g9sv\" (UID: \"c35bd0a7-4cba-4185-a45c-bfaf82c04638\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.113979 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7bcf\" 
(UniqueName: \"kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf\") pod \"rabbitmq-cluster-operator-index-9g9sv\" (UID: \"c35bd0a7-4cba-4185-a45c-bfaf82c04638\") " pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.117430 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/memcached-0" podStartSLOduration=1.6066996759999999 podStartE2EDuration="4.117402388s" podCreationTimestamp="2026-02-28 04:26:25 +0000 UTC" firstStartedPulling="2026-02-28 04:26:26.353906008 +0000 UTC m=+1008.348636200" lastFinishedPulling="2026-02-28 04:26:28.86460871 +0000 UTC m=+1010.859338912" observedRunningTime="2026-02-28 04:26:29.110927746 +0000 UTC m=+1011.105657938" watchObservedRunningTime="2026-02-28 04:26:29.117402388 +0000 UTC m=+1011.112132580" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.255781 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.652401 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/root-account-create-update-kjjsp"] Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.653803 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.655392 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-kjjsp"] Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.669775 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"openstack-mariadb-root-db-secret" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.710051 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.710112 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntnh6\" (UniqueName: \"kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.793195 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"] Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.814337 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.814445 5072 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntnh6\" (UniqueName: \"kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.815874 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:29 crc kubenswrapper[5072]: I0228 04:26:29.856425 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntnh6\" (UniqueName: \"kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6\") pod \"root-account-create-update-kjjsp\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:30 crc kubenswrapper[5072]: I0228 04:26:30.005388 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:30 crc kubenswrapper[5072]: I0228 04:26:30.107836 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" event={"ID":"c35bd0a7-4cba-4185-a45c-bfaf82c04638","Type":"ContainerStarted","Data":"bcdcba18b6737963fceb481c95cf1c03046dc7d97b3e63fd3f7505c33f6a9d52"} Feb 28 04:26:30 crc kubenswrapper[5072]: I0228 04:26:30.512265 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-kjjsp"] Feb 28 04:26:30 crc kubenswrapper[5072]: W0228 04:26:30.515397 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e775589_8da9_4347_925a_3ef4b5fb7e28.slice/crio-e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b WatchSource:0}: Error finding container e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b: Status 404 returned error can't find the container with id e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b Feb 28 04:26:31 crc kubenswrapper[5072]: I0228 04:26:31.011111 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/openstack-galera-2" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="galera" probeResult="failure" output=< Feb 28 04:26:31 crc kubenswrapper[5072]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Feb 28 04:26:31 crc kubenswrapper[5072]: > Feb 28 04:26:31 crc kubenswrapper[5072]: I0228 04:26:31.114086 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" event={"ID":"3e775589-8da9-4347-925a-3ef4b5fb7e28","Type":"ContainerStarted","Data":"e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b"} Feb 28 04:26:34 crc kubenswrapper[5072]: I0228 04:26:34.133446 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="horizon-kuttl-tests/root-account-create-update-kjjsp" event={"ID":"3e775589-8da9-4347-925a-3ef4b5fb7e28","Type":"ContainerStarted","Data":"016c30a85ac29fe47d215edf0d46c749f707c35ed18e7c3c68a3b8c3448b15d1"} Feb 28 04:26:34 crc kubenswrapper[5072]: I0228 04:26:34.153418 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" podStartSLOduration=5.15339887 podStartE2EDuration="5.15339887s" podCreationTimestamp="2026-02-28 04:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:26:34.149150917 +0000 UTC m=+1016.143881109" watchObservedRunningTime="2026-02-28 04:26:34.15339887 +0000 UTC m=+1016.148129062" Feb 28 04:26:35 crc kubenswrapper[5072]: I0228 04:26:35.953160 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:26:36 crc kubenswrapper[5072]: I0228 04:26:36.148030 5072 generic.go:334] "Generic (PLEG): container finished" podID="3e775589-8da9-4347-925a-3ef4b5fb7e28" containerID="016c30a85ac29fe47d215edf0d46c749f707c35ed18e7c3c68a3b8c3448b15d1" exitCode=0 Feb 28 04:26:36 crc kubenswrapper[5072]: I0228 04:26:36.148072 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" event={"ID":"3e775589-8da9-4347-925a-3ef4b5fb7e28","Type":"ContainerDied","Data":"016c30a85ac29fe47d215edf0d46c749f707c35ed18e7c3c68a3b8c3448b15d1"} Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.574902 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.648404 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntnh6\" (UniqueName: \"kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6\") pod \"3e775589-8da9-4347-925a-3ef4b5fb7e28\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.648472 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts\") pod \"3e775589-8da9-4347-925a-3ef4b5fb7e28\" (UID: \"3e775589-8da9-4347-925a-3ef4b5fb7e28\") " Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.649176 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e775589-8da9-4347-925a-3ef4b5fb7e28" (UID: "3e775589-8da9-4347-925a-3ef4b5fb7e28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.653462 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6" (OuterVolumeSpecName: "kube-api-access-ntnh6") pod "3e775589-8da9-4347-925a-3ef4b5fb7e28" (UID: "3e775589-8da9-4347-925a-3ef4b5fb7e28"). InnerVolumeSpecName "kube-api-access-ntnh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.750369 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntnh6\" (UniqueName: \"kubernetes.io/projected/3e775589-8da9-4347-925a-3ef4b5fb7e28-kube-api-access-ntnh6\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:37 crc kubenswrapper[5072]: I0228 04:26:37.750416 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e775589-8da9-4347-925a-3ef4b5fb7e28-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.160704 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" event={"ID":"3e775589-8da9-4347-925a-3ef4b5fb7e28","Type":"ContainerDied","Data":"e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b"} Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.160731 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-kjjsp" Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.160744 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4dafa3e5efadec99d78fccff548a07ff6fa4802d3150de34c20c07980a1526b" Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.162909 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" event={"ID":"c35bd0a7-4cba-4185-a45c-bfaf82c04638","Type":"ContainerStarted","Data":"b90943e538e0b47e4b23feddf1a30bf60bffbce8de498186a40175118f72a618"} Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.184478 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" podStartSLOduration=2.323054165 podStartE2EDuration="10.184455935s" podCreationTimestamp="2026-02-28 04:26:28 +0000 UTC" firstStartedPulling="2026-02-28 04:26:29.81912585 +0000 UTC m=+1011.813856042" lastFinishedPulling="2026-02-28 04:26:37.68052762 +0000 UTC m=+1019.675257812" observedRunningTime="2026-02-28 04:26:38.176106723 +0000 UTC m=+1020.170836945" watchObservedRunningTime="2026-02-28 04:26:38.184455935 +0000 UTC m=+1020.179186127" Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.718231 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:38 crc kubenswrapper[5072]: I0228 04:26:38.779250 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-1" Feb 28 04:26:39 crc kubenswrapper[5072]: I0228 04:26:39.257509 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:39 crc kubenswrapper[5072]: I0228 04:26:39.257560 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:39 crc kubenswrapper[5072]: I0228 04:26:39.278491 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:41 crc kubenswrapper[5072]: I0228 04:26:41.548760 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:41 crc kubenswrapper[5072]: I0228 04:26:41.643684 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:26:49 crc kubenswrapper[5072]: I0228 04:26:49.282521 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:26:50 crc kubenswrapper[5072]: I0228 04:26:50.105731 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:26:50 crc kubenswrapper[5072]: I0228 04:26:50.106078 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.567204 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"] Feb 28 04:27:00 crc kubenswrapper[5072]: E0228 04:27:00.568227 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e775589-8da9-4347-925a-3ef4b5fb7e28" 
containerName="mariadb-account-create-update" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.568242 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e775589-8da9-4347-925a-3ef4b5fb7e28" containerName="mariadb-account-create-update" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.568341 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e775589-8da9-4347-925a-3ef4b5fb7e28" containerName="mariadb-account-create-update" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.569352 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.571618 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-56tr7" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.578806 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"] Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.702718 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.703189 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwsh\" (UniqueName: \"kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.703226 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.804522 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.804599 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.804680 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwsh\" (UniqueName: \"kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 
04:27:00.805155 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.805157 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.828996 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwsh\" (UniqueName: \"kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:00 crc kubenswrapper[5072]: I0228 04:27:00.895939 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:01 crc kubenswrapper[5072]: I0228 04:27:01.323252 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"] Feb 28 04:27:02 crc kubenswrapper[5072]: I0228 04:27:02.303307 5072 generic.go:334] "Generic (PLEG): container finished" podID="f246c68d-2d24-48f5-9e70-7286730298f3" containerID="33ccc774ab28074e17fa9ba2a1ebd3191889cd797043e7b717faefdce0f44596" exitCode=0 Feb 28 04:27:02 crc kubenswrapper[5072]: I0228 04:27:02.303438 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" event={"ID":"f246c68d-2d24-48f5-9e70-7286730298f3","Type":"ContainerDied","Data":"33ccc774ab28074e17fa9ba2a1ebd3191889cd797043e7b717faefdce0f44596"} Feb 28 04:27:02 crc kubenswrapper[5072]: I0228 04:27:02.303786 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" event={"ID":"f246c68d-2d24-48f5-9e70-7286730298f3","Type":"ContainerStarted","Data":"4e986d444d2b88fca13cca559a2955c982f8c3b9adf73e52a43d3152e33ec1e2"} Feb 28 04:27:04 crc kubenswrapper[5072]: I0228 04:27:04.316258 5072 generic.go:334] "Generic (PLEG): container finished" podID="f246c68d-2d24-48f5-9e70-7286730298f3" containerID="97375b69562f960915f9036bf18eae4be0c446cda5fc0681419c1666f09be26a" exitCode=0 Feb 28 04:27:04 crc kubenswrapper[5072]: I0228 04:27:04.316290 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" event={"ID":"f246c68d-2d24-48f5-9e70-7286730298f3","Type":"ContainerDied","Data":"97375b69562f960915f9036bf18eae4be0c446cda5fc0681419c1666f09be26a"} Feb 28 04:27:05 crc kubenswrapper[5072]: I0228 04:27:05.324236 5072 generic.go:334] 
"Generic (PLEG): container finished" podID="f246c68d-2d24-48f5-9e70-7286730298f3" containerID="1f526d6333b068a7c3555bfc1959efaf818ea2668e28d86fd37e3e1bd8a6abb0" exitCode=0 Feb 28 04:27:05 crc kubenswrapper[5072]: I0228 04:27:05.324331 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" event={"ID":"f246c68d-2d24-48f5-9e70-7286730298f3","Type":"ContainerDied","Data":"1f526d6333b068a7c3555bfc1959efaf818ea2668e28d86fd37e3e1bd8a6abb0"} Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.673519 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.785697 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdwsh\" (UniqueName: \"kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh\") pod \"f246c68d-2d24-48f5-9e70-7286730298f3\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.785785 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util\") pod \"f246c68d-2d24-48f5-9e70-7286730298f3\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.785856 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle\") pod \"f246c68d-2d24-48f5-9e70-7286730298f3\" (UID: \"f246c68d-2d24-48f5-9e70-7286730298f3\") " Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.786463 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle" (OuterVolumeSpecName: "bundle") pod "f246c68d-2d24-48f5-9e70-7286730298f3" (UID: "f246c68d-2d24-48f5-9e70-7286730298f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.790800 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh" (OuterVolumeSpecName: "kube-api-access-mdwsh") pod "f246c68d-2d24-48f5-9e70-7286730298f3" (UID: "f246c68d-2d24-48f5-9e70-7286730298f3"). InnerVolumeSpecName "kube-api-access-mdwsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.801094 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util" (OuterVolumeSpecName: "util") pod "f246c68d-2d24-48f5-9e70-7286730298f3" (UID: "f246c68d-2d24-48f5-9e70-7286730298f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.888063 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.888111 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdwsh\" (UniqueName: \"kubernetes.io/projected/f246c68d-2d24-48f5-9e70-7286730298f3-kube-api-access-mdwsh\") on node \"crc\" DevicePath \"\""
Feb 28 04:27:06 crc kubenswrapper[5072]: I0228 04:27:06.888127 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f246c68d-2d24-48f5-9e70-7286730298f3-util\") on node \"crc\" DevicePath \"\""
Feb 28 04:27:07 crc kubenswrapper[5072]: I0228 04:27:07.340865 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj" event={"ID":"f246c68d-2d24-48f5-9e70-7286730298f3","Type":"ContainerDied","Data":"4e986d444d2b88fca13cca559a2955c982f8c3b9adf73e52a43d3152e33ec1e2"}
Feb 28 04:27:07 crc kubenswrapper[5072]: I0228 04:27:07.340903 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e986d444d2b88fca13cca559a2955c982f8c3b9adf73e52a43d3152e33ec1e2"
Feb 28 04:27:07 crc kubenswrapper[5072]: I0228 04:27:07.340917 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.444716 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"]
Feb 28 04:27:18 crc kubenswrapper[5072]: E0228 04:27:18.446680 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="util"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.446797 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="util"
Feb 28 04:27:18 crc kubenswrapper[5072]: E0228 04:27:18.446885 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="extract"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.446962 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="extract"
Feb 28 04:27:18 crc kubenswrapper[5072]: E0228 04:27:18.447050 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="pull"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.447123 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="pull"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.447345 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" containerName="extract"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.448023 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.450694 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-tkxgh"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.456871 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"]
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.543271 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzml\" (UniqueName: \"kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml\") pod \"rabbitmq-cluster-operator-779fc9694b-bt7bg\" (UID: \"373e4c12-ee6c-4f89-b684-fb8e61d18c9f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.645320 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzml\" (UniqueName: \"kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml\") pod \"rabbitmq-cluster-operator-779fc9694b-bt7bg\" (UID: \"373e4c12-ee6c-4f89-b684-fb8e61d18c9f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.671945 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzml\" (UniqueName: \"kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml\") pod \"rabbitmq-cluster-operator-779fc9694b-bt7bg\" (UID: \"373e4c12-ee6c-4f89-b684-fb8e61d18c9f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"
Feb 28 04:27:18 crc kubenswrapper[5072]: I0228 04:27:18.807177 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"
Feb 28 04:27:19 crc kubenswrapper[5072]: I0228 04:27:19.197959 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"]
Feb 28 04:27:19 crc kubenswrapper[5072]: I0228 04:27:19.407147 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" event={"ID":"373e4c12-ee6c-4f89-b684-fb8e61d18c9f","Type":"ContainerStarted","Data":"9cc7788cfbbf39ed7e3f8db499050cb547b13be135eaeaa59e64e0b95df25532"}
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.105503 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.105575 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.105628 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf"
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.106321 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.106371 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023" gracePeriod=600
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.422173 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023" exitCode=0
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.422217 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023"}
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.422244 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04"}
Feb 28 04:27:20 crc kubenswrapper[5072]: I0228 04:27:20.422259 5072 scope.go:117] "RemoveContainer" containerID="a7da6d10ce5d74918d539d5f69d6835b46ff28621ce44b337a029f6864cad079"
Feb 28 04:27:23 crc kubenswrapper[5072]: I0228 04:27:23.443679 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" event={"ID":"373e4c12-ee6c-4f89-b684-fb8e61d18c9f","Type":"ContainerStarted","Data":"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef"}
Feb 28 04:27:23 crc kubenswrapper[5072]: I0228 04:27:23.460386 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" podStartSLOduration=2.325670835 podStartE2EDuration="5.460361008s" podCreationTimestamp="2026-02-28 04:27:18 +0000 UTC" firstStartedPulling="2026-02-28 04:27:19.207082043 +0000 UTC m=+1061.201812235" lastFinishedPulling="2026-02-28 04:27:22.341772206 +0000 UTC m=+1064.336502408" observedRunningTime="2026-02-28 04:27:23.458136189 +0000 UTC m=+1065.452866391" watchObservedRunningTime="2026-02-28 04:27:23.460361008 +0000 UTC m=+1065.455091200"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.706211 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"]
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.714024 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.717770 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-erlang-cookie"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.717823 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"]
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.718083 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-default-user"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.718517 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-server-dockercfg-nxkgk"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.718676 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-plugins-conf"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.718852 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-server-conf"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.795516 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.795602 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdfb\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.795733 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.795839 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.795969 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.797557 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.797751 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.797778 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899273 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899581 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899828 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899865 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdfb\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899904 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.899950 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.900001 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.900074 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.900385 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.900522 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.901149 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.905594 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.905704 5072 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.905736 5072 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/506da916c43778d76942a0a7bcd47935c2d949abeeb50bf4a16bcbd6ebfad83a/globalmount\"" pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.914283 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.914764 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.917409 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdfb\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:28 crc kubenswrapper[5072]: I0228 04:27:28.929705 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") pod \"rabbitmq-server-0\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:29 crc kubenswrapper[5072]: I0228 04:27:29.038514 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:27:29 crc kubenswrapper[5072]: I0228 04:27:29.482143 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"]
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.329557 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"]
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.330324 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.332089 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-65c7x"
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.341513 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"]
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.418700 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpckf\" (UniqueName: \"kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf\") pod \"keystone-operator-index-crv8s\" (UID: \"f7ec1561-2733-469f-b4b4-13035f2557f0\") " pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.503272 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerStarted","Data":"269042d0c382cdfb8d4cf6f67f41cc281a02225c549660ad5d64100ede0972ef"}
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.519745 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpckf\" (UniqueName: \"kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf\") pod \"keystone-operator-index-crv8s\" (UID: \"f7ec1561-2733-469f-b4b4-13035f2557f0\") " pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.551876 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpckf\" (UniqueName: \"kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf\") pod \"keystone-operator-index-crv8s\" (UID: \"f7ec1561-2733-469f-b4b4-13035f2557f0\") " pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:30 crc kubenswrapper[5072]: I0228 04:27:30.653310 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:31 crc kubenswrapper[5072]: I0228 04:27:31.067350 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"]
Feb 28 04:27:31 crc kubenswrapper[5072]: W0228 04:27:31.075076 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ec1561_2733_469f_b4b4_13035f2557f0.slice/crio-1e5481e986c0cf83a9d0bf6b39b68cf807559761b304457fcd5575cbdf66157b WatchSource:0}: Error finding container 1e5481e986c0cf83a9d0bf6b39b68cf807559761b304457fcd5575cbdf66157b: Status 404 returned error can't find the container with id 1e5481e986c0cf83a9d0bf6b39b68cf807559761b304457fcd5575cbdf66157b
Feb 28 04:27:31 crc kubenswrapper[5072]: I0228 04:27:31.522604 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-crv8s" event={"ID":"f7ec1561-2733-469f-b4b4-13035f2557f0","Type":"ContainerStarted","Data":"1e5481e986c0cf83a9d0bf6b39b68cf807559761b304457fcd5575cbdf66157b"}
Feb 28 04:27:41 crc kubenswrapper[5072]: I0228 04:27:41.607759 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-crv8s" event={"ID":"f7ec1561-2733-469f-b4b4-13035f2557f0","Type":"ContainerStarted","Data":"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81"}
Feb 28 04:27:41 crc kubenswrapper[5072]: I0228 04:27:41.624516 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-crv8s" podStartSLOduration=1.979573418 podStartE2EDuration="11.624495722s" podCreationTimestamp="2026-02-28 04:27:30 +0000 UTC" firstStartedPulling="2026-02-28 04:27:31.080030066 +0000 UTC m=+1073.074760258" lastFinishedPulling="2026-02-28 04:27:40.72495238 +0000 UTC m=+1082.719682562" observedRunningTime="2026-02-28 04:27:41.619689842 +0000 UTC m=+1083.614420034" watchObservedRunningTime="2026-02-28 04:27:41.624495722 +0000 UTC m=+1083.619225914"
Feb 28 04:27:42 crc kubenswrapper[5072]: I0228 04:27:42.615033 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerStarted","Data":"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"}
Feb 28 04:27:50 crc kubenswrapper[5072]: I0228 04:27:50.654244 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:50 crc kubenswrapper[5072]: I0228 04:27:50.654529 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:50 crc kubenswrapper[5072]: I0228 04:27:50.712265 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:50 crc kubenswrapper[5072]: I0228 04:27:50.755801 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.160748 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"]
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.164813 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.167571 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-56tr7"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.169418 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"]
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.277807 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhb2\" (UniqueName: \"kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.277872 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.277928 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.378969 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.379071 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhb2\" (UniqueName: \"kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.379106 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.379570 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.379585 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.399046 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhb2\" (UniqueName: \"kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.484937 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
Feb 28 04:27:59 crc kubenswrapper[5072]: I0228 04:27:59.926064 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"]
Feb 28 04:27:59 crc kubenswrapper[5072]: W0228 04:27:59.935944 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46bb064f_a1bd_40b6_baaf_0ed5f71c926d.slice/crio-437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799 WatchSource:0}: Error finding container 437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799: Status 404 returned error can't find the container with id 437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.128396 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537548-jhcpw"]
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.129460 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537548-jhcpw"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.136364 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.136613 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.136821 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.146936 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537548-jhcpw"]
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.189248 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7jkc\" (UniqueName: \"kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc\") pod \"auto-csr-approver-29537548-jhcpw\" (UID: \"08a2756a-1558-472d-8e33-f3b8009eadab\") " pod="openshift-infra/auto-csr-approver-29537548-jhcpw"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.290341 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7jkc\" (UniqueName: \"kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc\") pod \"auto-csr-approver-29537548-jhcpw\" (UID: \"08a2756a-1558-472d-8e33-f3b8009eadab\") " pod="openshift-infra/auto-csr-approver-29537548-jhcpw"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.309567 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7jkc\" (UniqueName: \"kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc\") pod \"auto-csr-approver-29537548-jhcpw\" (UID: \"08a2756a-1558-472d-8e33-f3b8009eadab\") " pod="openshift-infra/auto-csr-approver-29537548-jhcpw"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.457706 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537548-jhcpw"
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.738507 5072 generic.go:334] "Generic (PLEG): container finished" podID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerID="c99c67d773d47537e40138ff4e744bc04738a379e6121f5997c86d12e6ec484e" exitCode=0
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.738728 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" event={"ID":"46bb064f-a1bd-40b6-baaf-0ed5f71c926d","Type":"ContainerDied","Data":"c99c67d773d47537e40138ff4e744bc04738a379e6121f5997c86d12e6ec484e"}
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.738822 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" event={"ID":"46bb064f-a1bd-40b6-baaf-0ed5f71c926d","Type":"ContainerStarted","Data":"437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799"}
Feb 28 04:28:00 crc kubenswrapper[5072]: I0228 04:28:00.902389 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537548-jhcpw"]
Feb 28 04:28:00 crc kubenswrapper[5072]: W0228 04:28:00.903610 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a2756a_1558_472d_8e33_f3b8009eadab.slice/crio-76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc WatchSource:0}: Error finding container 76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc: Status 404 returned error can't find the container with id 76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc
Feb 28 04:28:01 crc kubenswrapper[5072]: I0228 04:28:01.746907 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537548-jhcpw" event={"ID":"08a2756a-1558-472d-8e33-f3b8009eadab","Type":"ContainerStarted","Data":"76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc"}
Feb 28 04:28:01 crc kubenswrapper[5072]: I0228 04:28:01.750448 5072 generic.go:334] "Generic (PLEG): container finished" podID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerID="f9638a0cec913e593ae258b1ac7834421159ad66a6576002408944fd84951445" exitCode=0
Feb 28 04:28:01 crc kubenswrapper[5072]: I0228 04:28:01.750486 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" event={"ID":"46bb064f-a1bd-40b6-baaf-0ed5f71c926d","Type":"ContainerDied","Data":"f9638a0cec913e593ae258b1ac7834421159ad66a6576002408944fd84951445"}
Feb 28 04:28:02 crc kubenswrapper[5072]: I0228 04:28:02.763598 5072 generic.go:334] "Generic (PLEG): container finished" podID="08a2756a-1558-472d-8e33-f3b8009eadab" containerID="1e397dde6a93677fc8e88c0f3e937758b2b430a51f0e349375967baad2b97822" exitCode=0
Feb 28 04:28:02 crc kubenswrapper[5072]: I0228 04:28:02.763700 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537548-jhcpw" event={"ID":"08a2756a-1558-472d-8e33-f3b8009eadab","Type":"ContainerDied","Data":"1e397dde6a93677fc8e88c0f3e937758b2b430a51f0e349375967baad2b97822"}
Feb 28 04:28:02 crc kubenswrapper[5072]: I0228 04:28:02.766216 5072 generic.go:334] "Generic (PLEG): container finished" podID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerID="ff1d0c7815426e13f87eebffa095214fb81ce4500673644df05798044ffac016" exitCode=0
Feb 28 04:28:02 crc kubenswrapper[5072]: I0228 04:28:02.766258 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"
event={"ID":"46bb064f-a1bd-40b6-baaf-0ed5f71c926d","Type":"ContainerDied","Data":"ff1d0c7815426e13f87eebffa095214fb81ce4500673644df05798044ffac016"} Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.066222 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537548-jhcpw" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.071832 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.141932 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7jkc\" (UniqueName: \"kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc\") pod \"08a2756a-1558-472d-8e33-f3b8009eadab\" (UID: \"08a2756a-1558-472d-8e33-f3b8009eadab\") " Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.142095 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle\") pod \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.142134 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mhb2\" (UniqueName: \"kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2\") pod \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.142176 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util\") pod \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\" (UID: \"46bb064f-a1bd-40b6-baaf-0ed5f71c926d\") " Feb 28 
04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.142963 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle" (OuterVolumeSpecName: "bundle") pod "46bb064f-a1bd-40b6-baaf-0ed5f71c926d" (UID: "46bb064f-a1bd-40b6-baaf-0ed5f71c926d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.147930 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2" (OuterVolumeSpecName: "kube-api-access-4mhb2") pod "46bb064f-a1bd-40b6-baaf-0ed5f71c926d" (UID: "46bb064f-a1bd-40b6-baaf-0ed5f71c926d"). InnerVolumeSpecName "kube-api-access-4mhb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.148860 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc" (OuterVolumeSpecName: "kube-api-access-t7jkc") pod "08a2756a-1558-472d-8e33-f3b8009eadab" (UID: "08a2756a-1558-472d-8e33-f3b8009eadab"). InnerVolumeSpecName "kube-api-access-t7jkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.158194 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util" (OuterVolumeSpecName: "util") pod "46bb064f-a1bd-40b6-baaf-0ed5f71c926d" (UID: "46bb064f-a1bd-40b6-baaf-0ed5f71c926d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.243761 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.243796 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mhb2\" (UniqueName: \"kubernetes.io/projected/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-kube-api-access-4mhb2\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.243806 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bb064f-a1bd-40b6-baaf-0ed5f71c926d-util\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.243818 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7jkc\" (UniqueName: \"kubernetes.io/projected/08a2756a-1558-472d-8e33-f3b8009eadab-kube-api-access-t7jkc\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.779952 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537548-jhcpw" event={"ID":"08a2756a-1558-472d-8e33-f3b8009eadab","Type":"ContainerDied","Data":"76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc"} Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.780226 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76cc2edea49b0cc93cf49656eeba9d7532c6f61d4c3e25360085fa49c75eb2fc" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.779992 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537548-jhcpw" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.782372 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" event={"ID":"46bb064f-a1bd-40b6-baaf-0ed5f71c926d","Type":"ContainerDied","Data":"437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799"} Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.782613 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437c3885cc0b5f8641539beebbff550f4bb23a37b277fcaff3097eb86e388799" Feb 28 04:28:04 crc kubenswrapper[5072]: I0228 04:28:04.782430 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d" Feb 28 04:28:05 crc kubenswrapper[5072]: I0228 04:28:05.128461 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537542-98224"] Feb 28 04:28:05 crc kubenswrapper[5072]: I0228 04:28:05.133040 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537542-98224"] Feb 28 04:28:06 crc kubenswrapper[5072]: I0228 04:28:06.666844 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1652ce01-0324-46d3-8f09-e946acb926e4" path="/var/lib/kubelet/pods/1652ce01-0324-46d3-8f09-e946acb926e4/volumes" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.632256 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"] Feb 28 04:28:12 crc kubenswrapper[5072]: E0228 04:28:12.634268 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a2756a-1558-472d-8e33-f3b8009eadab" containerName="oc" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.634365 5072 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08a2756a-1558-472d-8e33-f3b8009eadab" containerName="oc" Feb 28 04:28:12 crc kubenswrapper[5072]: E0228 04:28:12.634447 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="util" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.634519 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="util" Feb 28 04:28:12 crc kubenswrapper[5072]: E0228 04:28:12.634623 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="extract" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.634740 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="extract" Feb 28 04:28:12 crc kubenswrapper[5072]: E0228 04:28:12.634838 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="pull" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.634909 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="pull" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.635134 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a2756a-1558-472d-8e33-f3b8009eadab" containerName="oc" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.635268 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" containerName="extract" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.635888 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.639536 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"] Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.641078 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zhpc6" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.642021 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.651416 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.651777 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5ml6\" (UniqueName: \"kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.652008 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" 
(UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.764313 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5ml6\" (UniqueName: \"kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.764398 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.764499 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.795117 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.795163 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.809876 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5ml6\" (UniqueName: \"kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6\") pod \"keystone-operator-controller-manager-784c7fcf4f-fvf4q\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") " pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:12 crc kubenswrapper[5072]: I0228 04:28:12.959814 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:13 crc kubenswrapper[5072]: I0228 04:28:13.371837 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"] Feb 28 04:28:13 crc kubenswrapper[5072]: I0228 04:28:13.841629 5072 generic.go:334] "Generic (PLEG): container finished" podID="c71c158a-9876-4f8e-9100-7c0a36834415" containerID="25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e" exitCode=0 Feb 28 04:28:13 crc kubenswrapper[5072]: I0228 04:28:13.841744 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerDied","Data":"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"} Feb 28 04:28:13 crc kubenswrapper[5072]: I0228 04:28:13.842929 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" event={"ID":"6325e48f-129d-4832-99d8-1cd8088708c3","Type":"ContainerStarted","Data":"297f2d54391d1c87255c4371f6bee8d33f5f083f0dc1db2a939647c519af08c0"} Feb 28 04:28:14 crc kubenswrapper[5072]: I0228 04:28:14.852181 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerStarted","Data":"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"} Feb 28 04:28:14 crc kubenswrapper[5072]: I0228 04:28:14.853340 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/rabbitmq-server-0" Feb 28 04:28:14 crc kubenswrapper[5072]: I0228 04:28:14.875890 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.618506347 podStartE2EDuration="47.875865614s" podCreationTimestamp="2026-02-28 04:27:27 +0000 UTC" firstStartedPulling="2026-02-28 04:27:29.491586324 +0000 UTC m=+1071.486316516" lastFinishedPulling="2026-02-28 04:27:40.748945591 +0000 UTC m=+1082.743675783" observedRunningTime="2026-02-28 04:28:14.870455954 +0000 UTC m=+1116.865186166" watchObservedRunningTime="2026-02-28 04:28:14.875865614 +0000 UTC m=+1116.870595816" Feb 28 04:28:18 crc kubenswrapper[5072]: I0228 04:28:18.885182 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" event={"ID":"6325e48f-129d-4832-99d8-1cd8088708c3","Type":"ContainerStarted","Data":"9257be1e822cb1d78e5876a482a752924d94b6f97d2c75b12e040888675fc056"} Feb 28 04:28:18 crc kubenswrapper[5072]: I0228 04:28:18.885793 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:18 crc kubenswrapper[5072]: I0228 04:28:18.910707 5072 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" podStartSLOduration=2.558891558 podStartE2EDuration="6.910688434s" podCreationTimestamp="2026-02-28 04:28:12 +0000 UTC" firstStartedPulling="2026-02-28 04:28:13.383122164 +0000 UTC m=+1115.377852366" lastFinishedPulling="2026-02-28 04:28:17.73491905 +0000 UTC m=+1119.729649242" observedRunningTime="2026-02-28 04:28:18.907580246 +0000 UTC m=+1120.902310428" watchObservedRunningTime="2026-02-28 04:28:18.910688434 +0000 UTC m=+1120.905418626" Feb 28 04:28:20 crc kubenswrapper[5072]: I0228 04:28:20.434801 5072 scope.go:117] "RemoveContainer" containerID="b027ac162ddf00b261727df3c966b678a3f6e1c5500fe190c8c13f10a09e355f" Feb 28 04:28:22 crc kubenswrapper[5072]: I0228 04:28:22.964680 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.105878 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-create-lkm6s"] Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.107178 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.113953 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-lkm6s"] Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.124098 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"] Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.124905 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.127266 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-db-secret" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.143612 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"] Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.287830 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.287940 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.287973 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9qjr\" (UniqueName: \"kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.288003 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tnx\" (UniqueName: 
\"kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.389806 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.389866 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9qjr\" (UniqueName: \"kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.389907 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tnx\" (UniqueName: \"kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.389953 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.390908 5072 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.391088 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.410981 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tnx\" (UniqueName: \"kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx\") pod \"keystone-f9dc-account-create-update-b59fq\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") " pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.418270 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9qjr\" (UniqueName: \"kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr\") pod \"keystone-db-create-lkm6s\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") " pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.426389 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-lkm6s" Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.444115 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"
Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.896828 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"]
Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.932560 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-lkm6s"]
Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.956084 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-lkm6s" event={"ID":"8e18fcea-e598-4728-9708-b423c2f5686b","Type":"ContainerStarted","Data":"163820be98542dbe7bc95f19ccfe37c92d00b166208f9efc8c685517b6695348"}
Feb 28 04:28:28 crc kubenswrapper[5072]: I0228 04:28:28.962123 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" event={"ID":"df9b3873-ce74-47e3-a875-d07950e69125","Type":"ContainerStarted","Data":"77bcdb4796a59bcd2296dbc31109c153693119799a99eafaa515bfea56a4512c"}
Feb 28 04:28:29 crc kubenswrapper[5072]: I0228 04:28:29.041836 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:28:29 crc kubenswrapper[5072]: I0228 04:28:29.969129 5072 generic.go:334] "Generic (PLEG): container finished" podID="8e18fcea-e598-4728-9708-b423c2f5686b" containerID="24d89ef02fafea74f917037f7a02488a825cb5c52148d4d1e00fad0f9a2149fb" exitCode=0
Feb 28 04:28:29 crc kubenswrapper[5072]: I0228 04:28:29.969404 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-lkm6s" event={"ID":"8e18fcea-e598-4728-9708-b423c2f5686b","Type":"ContainerDied","Data":"24d89ef02fafea74f917037f7a02488a825cb5c52148d4d1e00fad0f9a2149fb"}
Feb 28 04:28:29 crc kubenswrapper[5072]: I0228 04:28:29.970930 5072 generic.go:334] "Generic (PLEG): container finished" podID="df9b3873-ce74-47e3-a875-d07950e69125" containerID="a25b28fed269ede81a5fc760a79dfb9947cece8bd5bbc9f8b86506a2ad6f18ec" exitCode=0
Feb 28 04:28:29 crc kubenswrapper[5072]: I0228 04:28:29.970953 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" event={"ID":"df9b3873-ce74-47e3-a875-d07950e69125","Type":"ContainerDied","Data":"a25b28fed269ede81a5fc760a79dfb9947cece8bd5bbc9f8b86506a2ad6f18ec"}
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.272805 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-lkm6s"
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.277791 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.440680 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6tnx\" (UniqueName: \"kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx\") pod \"df9b3873-ce74-47e3-a875-d07950e69125\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") "
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.440767 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9qjr\" (UniqueName: \"kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr\") pod \"8e18fcea-e598-4728-9708-b423c2f5686b\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") "
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.440798 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts\") pod \"df9b3873-ce74-47e3-a875-d07950e69125\" (UID: \"df9b3873-ce74-47e3-a875-d07950e69125\") "
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.441315 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df9b3873-ce74-47e3-a875-d07950e69125" (UID: "df9b3873-ce74-47e3-a875-d07950e69125"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.441438 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts\") pod \"8e18fcea-e598-4728-9708-b423c2f5686b\" (UID: \"8e18fcea-e598-4728-9708-b423c2f5686b\") "
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.441854 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e18fcea-e598-4728-9708-b423c2f5686b" (UID: "8e18fcea-e598-4728-9708-b423c2f5686b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.442101 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e18fcea-e598-4728-9708-b423c2f5686b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.442318 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9b3873-ce74-47e3-a875-d07950e69125-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.448831 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr" (OuterVolumeSpecName: "kube-api-access-w9qjr") pod "8e18fcea-e598-4728-9708-b423c2f5686b" (UID: "8e18fcea-e598-4728-9708-b423c2f5686b"). InnerVolumeSpecName "kube-api-access-w9qjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.448887 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx" (OuterVolumeSpecName: "kube-api-access-h6tnx") pod "df9b3873-ce74-47e3-a875-d07950e69125" (UID: "df9b3873-ce74-47e3-a875-d07950e69125"). InnerVolumeSpecName "kube-api-access-h6tnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.543931 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6tnx\" (UniqueName: \"kubernetes.io/projected/df9b3873-ce74-47e3-a875-d07950e69125-kube-api-access-h6tnx\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.543977 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9qjr\" (UniqueName: \"kubernetes.io/projected/8e18fcea-e598-4728-9708-b423c2f5686b-kube-api-access-w9qjr\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.983319 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-lkm6s" event={"ID":"8e18fcea-e598-4728-9708-b423c2f5686b","Type":"ContainerDied","Data":"163820be98542dbe7bc95f19ccfe37c92d00b166208f9efc8c685517b6695348"}
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.983365 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="163820be98542dbe7bc95f19ccfe37c92d00b166208f9efc8c685517b6695348"
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.983775 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-lkm6s"
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.984906 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq" event={"ID":"df9b3873-ce74-47e3-a875-d07950e69125","Type":"ContainerDied","Data":"77bcdb4796a59bcd2296dbc31109c153693119799a99eafaa515bfea56a4512c"}
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.984953 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bcdb4796a59bcd2296dbc31109c153693119799a99eafaa515bfea56a4512c"
Feb 28 04:28:31 crc kubenswrapper[5072]: I0228 04:28:31.984966 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.674194 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-lxxwv"]
Feb 28 04:28:33 crc kubenswrapper[5072]: E0228 04:28:33.674716 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e18fcea-e598-4728-9708-b423c2f5686b" containerName="mariadb-database-create"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.674730 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e18fcea-e598-4728-9708-b423c2f5686b" containerName="mariadb-database-create"
Feb 28 04:28:33 crc kubenswrapper[5072]: E0228 04:28:33.674758 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9b3873-ce74-47e3-a875-d07950e69125" containerName="mariadb-account-create-update"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.674764 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b3873-ce74-47e3-a875-d07950e69125" containerName="mariadb-account-create-update"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.674873 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e18fcea-e598-4728-9708-b423c2f5686b" containerName="mariadb-database-create"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.674884 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9b3873-ce74-47e3-a875-d07950e69125" containerName="mariadb-account-create-update"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.675298 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.678358 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-r2z6h"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.678771 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.681544 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.682490 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.690442 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-lxxwv"]
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.775324 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxh92\" (UniqueName: \"kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.775388 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.877146 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxh92\" (UniqueName: \"kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.877203 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.883015 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.895627 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxh92\" (UniqueName: \"kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92\") pod \"keystone-db-sync-lxxwv\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") " pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:33 crc kubenswrapper[5072]: I0228 04:28:33.996937 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:34 crc kubenswrapper[5072]: I0228 04:28:34.388764 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-lxxwv"]
Feb 28 04:28:34 crc kubenswrapper[5072]: W0228 04:28:34.399888 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17c3258d_a634_499a_98b2_3bf41a18a4b1.slice/crio-128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a WatchSource:0}: Error finding container 128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a: Status 404 returned error can't find the container with id 128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a
Feb 28 04:28:35 crc kubenswrapper[5072]: I0228 04:28:35.009228 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv" event={"ID":"17c3258d-a634-499a-98b2-3bf41a18a4b1","Type":"ContainerStarted","Data":"128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a"}
Feb 28 04:28:37 crc kubenswrapper[5072]: I0228 04:28:37.829259 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"]
Feb 28 04:28:37 crc kubenswrapper[5072]: I0228 04:28:37.830265 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:37 crc kubenswrapper[5072]: I0228 04:28:37.835573 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-56rsf"
Feb 28 04:28:37 crc kubenswrapper[5072]: I0228 04:28:37.838476 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"]
Feb 28 04:28:37 crc kubenswrapper[5072]: I0228 04:28:37.941397 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fp6\" (UniqueName: \"kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6\") pod \"horizon-operator-index-4gztb\" (UID: \"b9ce90ac-7ea8-44b6-bfae-05f51789c804\") " pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:38 crc kubenswrapper[5072]: I0228 04:28:38.042928 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4fp6\" (UniqueName: \"kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6\") pod \"horizon-operator-index-4gztb\" (UID: \"b9ce90ac-7ea8-44b6-bfae-05f51789c804\") " pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:38 crc kubenswrapper[5072]: I0228 04:28:38.065562 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fp6\" (UniqueName: \"kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6\") pod \"horizon-operator-index-4gztb\" (UID: \"b9ce90ac-7ea8-44b6-bfae-05f51789c804\") " pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:38 crc kubenswrapper[5072]: I0228 04:28:38.151429 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:41 crc kubenswrapper[5072]: W0228 04:28:41.564619 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ce90ac_7ea8_44b6_bfae_05f51789c804.slice/crio-ec5d3a3d26432bfb5e010bbf5e39a59654ec9329225572915915c16db19fe0ae WatchSource:0}: Error finding container ec5d3a3d26432bfb5e010bbf5e39a59654ec9329225572915915c16db19fe0ae: Status 404 returned error can't find the container with id ec5d3a3d26432bfb5e010bbf5e39a59654ec9329225572915915c16db19fe0ae
Feb 28 04:28:41 crc kubenswrapper[5072]: I0228 04:28:41.565707 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"]
Feb 28 04:28:42 crc kubenswrapper[5072]: I0228 04:28:42.067262 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv" event={"ID":"17c3258d-a634-499a-98b2-3bf41a18a4b1","Type":"ContainerStarted","Data":"3bcec2ae5530ff7b04b1f31f5f972f7f68cf676d949e3a12f3852a7bb1ab9ecb"}
Feb 28 04:28:42 crc kubenswrapper[5072]: I0228 04:28:42.068357 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-4gztb" event={"ID":"b9ce90ac-7ea8-44b6-bfae-05f51789c804","Type":"ContainerStarted","Data":"ec5d3a3d26432bfb5e010bbf5e39a59654ec9329225572915915c16db19fe0ae"}
Feb 28 04:28:42 crc kubenswrapper[5072]: I0228 04:28:42.088700 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv" podStartSLOduration=2.2850354 podStartE2EDuration="9.088679296s" podCreationTimestamp="2026-02-28 04:28:33 +0000 UTC" firstStartedPulling="2026-02-28 04:28:34.401503186 +0000 UTC m=+1136.396233378" lastFinishedPulling="2026-02-28 04:28:41.205147082 +0000 UTC m=+1143.199877274" observedRunningTime="2026-02-28 04:28:42.085395444 +0000 UTC m=+1144.080125646" watchObservedRunningTime="2026-02-28 04:28:42.088679296 +0000 UTC m=+1144.083409488"
Feb 28 04:28:45 crc kubenswrapper[5072]: I0228 04:28:45.090900 5072 generic.go:334] "Generic (PLEG): container finished" podID="17c3258d-a634-499a-98b2-3bf41a18a4b1" containerID="3bcec2ae5530ff7b04b1f31f5f972f7f68cf676d949e3a12f3852a7bb1ab9ecb" exitCode=0
Feb 28 04:28:45 crc kubenswrapper[5072]: I0228 04:28:45.091092 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv" event={"ID":"17c3258d-a634-499a-98b2-3bf41a18a4b1","Type":"ContainerDied","Data":"3bcec2ae5530ff7b04b1f31f5f972f7f68cf676d949e3a12f3852a7bb1ab9ecb"}
Feb 28 04:28:45 crc kubenswrapper[5072]: I0228 04:28:45.094104 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-4gztb" event={"ID":"b9ce90ac-7ea8-44b6-bfae-05f51789c804","Type":"ContainerStarted","Data":"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"}
Feb 28 04:28:45 crc kubenswrapper[5072]: I0228 04:28:45.124112 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-4gztb" podStartSLOduration=5.47770194 podStartE2EDuration="8.12409549s" podCreationTimestamp="2026-02-28 04:28:37 +0000 UTC" firstStartedPulling="2026-02-28 04:28:41.568287445 +0000 UTC m=+1143.563017637" lastFinishedPulling="2026-02-28 04:28:44.214680995 +0000 UTC m=+1146.209411187" observedRunningTime="2026-02-28 04:28:45.118577957 +0000 UTC m=+1147.113308149" watchObservedRunningTime="2026-02-28 04:28:45.12409549 +0000 UTC m=+1147.118825682"
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.358464 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.463466 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxh92\" (UniqueName: \"kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92\") pod \"17c3258d-a634-499a-98b2-3bf41a18a4b1\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") "
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.463539 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data\") pod \"17c3258d-a634-499a-98b2-3bf41a18a4b1\" (UID: \"17c3258d-a634-499a-98b2-3bf41a18a4b1\") "
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.468323 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92" (OuterVolumeSpecName: "kube-api-access-bxh92") pod "17c3258d-a634-499a-98b2-3bf41a18a4b1" (UID: "17c3258d-a634-499a-98b2-3bf41a18a4b1"). InnerVolumeSpecName "kube-api-access-bxh92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.493258 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data" (OuterVolumeSpecName: "config-data") pod "17c3258d-a634-499a-98b2-3bf41a18a4b1" (UID: "17c3258d-a634-499a-98b2-3bf41a18a4b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.564864 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxh92\" (UniqueName: \"kubernetes.io/projected/17c3258d-a634-499a-98b2-3bf41a18a4b1-kube-api-access-bxh92\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:46 crc kubenswrapper[5072]: I0228 04:28:46.564892 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c3258d-a634-499a-98b2-3bf41a18a4b1-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.105923 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv" event={"ID":"17c3258d-a634-499a-98b2-3bf41a18a4b1","Type":"ContainerDied","Data":"128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a"}
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.106198 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="128e099c497ee2f6940c57ba5596fb617cae0586cd81531fb244a286b6d6cc7a"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.105997 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-lxxwv"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.306802 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-gffh8"]
Feb 28 04:28:47 crc kubenswrapper[5072]: E0228 04:28:47.307237 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c3258d-a634-499a-98b2-3bf41a18a4b1" containerName="keystone-db-sync"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.307301 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c3258d-a634-499a-98b2-3bf41a18a4b1" containerName="keystone-db-sync"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.307464 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c3258d-a634-499a-98b2-3bf41a18a4b1" containerName="keystone-db-sync"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.308034 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.310133 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"osp-secret"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.310289 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.310488 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.311300 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.321561 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-r2z6h"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.321702 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-gffh8"]
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.483563 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.483755 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.483791 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.483831 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.484019 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsb6q\" (UniqueName: \"kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.586080 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.586180 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.586209 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.586257 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.586307 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsb6q\" (UniqueName: \"kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.590935 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.590954 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.590951 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.595050 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.610973 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsb6q\" (UniqueName: \"kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q\") pod \"keystone-bootstrap-gffh8\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:47 crc kubenswrapper[5072]: I0228 04:28:47.625679 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8"
Feb 28 04:28:48 crc kubenswrapper[5072]: W0228 04:28:48.141310 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf489d842_f259_4a58_af65_16f8d04dfa07.slice/crio-aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6 WatchSource:0}: Error finding container aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6: Status 404 returned error can't find the container with id aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6
Feb 28 04:28:48 crc kubenswrapper[5072]: I0228 04:28:48.142560 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-gffh8"]
Feb 28 04:28:48 crc kubenswrapper[5072]: I0228 04:28:48.152607 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:48 crc kubenswrapper[5072]: I0228 04:28:48.152666 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:48 crc kubenswrapper[5072]: I0228 04:28:48.196679 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:49 crc kubenswrapper[5072]: I0228 04:28:49.125786 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" event={"ID":"f489d842-f259-4a58-af65-16f8d04dfa07","Type":"ContainerStarted","Data":"0cf647178618894fa41d0aabb4dd23ee31971bc7401ba75c71d69b60e4e65c83"}
Feb 28 04:28:49 crc kubenswrapper[5072]: I0228 04:28:49.126262 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" event={"ID":"f489d842-f259-4a58-af65-16f8d04dfa07","Type":"ContainerStarted","Data":"aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6"}
Feb 28 04:28:49 crc kubenswrapper[5072]: I0228 04:28:49.152558 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" podStartSLOduration=2.15254057 podStartE2EDuration="2.15254057s" podCreationTimestamp="2026-02-28 04:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:28:49.148793893 +0000 UTC m=+1151.143524085" watchObservedRunningTime="2026-02-28 04:28:49.15254057 +0000 UTC m=+1151.147270762"
Feb 28 04:28:49 crc kubenswrapper[5072]: I0228 04:28:49.162551 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.575235 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"]
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.576890 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.586444 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-56tr7"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.595882 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"]
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.730252 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.730341 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcmj\" (UniqueName: \"kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.730513 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.832387 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.832499 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcmj\" (UniqueName: \"kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.832551 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.833200 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.833202 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.853418 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcmj\" (UniqueName: \"kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj\") pod \"627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:50 crc kubenswrapper[5072]: I0228 04:28:50.897480 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"
Feb 28 04:28:51 crc kubenswrapper[5072]: I0228 04:28:51.340983 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"]
Feb 28 04:28:52 crc kubenswrapper[5072]: I0228 04:28:52.150078 5072 generic.go:334] "Generic (PLEG): container finished" podID="2b9b43ee-d217-4d73-8029-176c01146473" containerID="4683ec73a7f6874e4e5185b13be360203fa99747a921ac6c540975beca0ed9f5" exitCode=0
Feb 28 04:28:52 crc kubenswrapper[5072]: I0228 04:28:52.150153 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" event={"ID":"2b9b43ee-d217-4d73-8029-176c01146473","Type":"ContainerDied","Data":"4683ec73a7f6874e4e5185b13be360203fa99747a921ac6c540975beca0ed9f5"}
Feb 28 04:28:52 crc kubenswrapper[5072]: I0228 04:28:52.150182 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" event={"ID":"2b9b43ee-d217-4d73-8029-176c01146473","Type":"ContainerStarted","Data":"d3b988c0b6993518ee790a1baad317aec751b7016dadc021773980a3afc18bb2"} Feb 28 04:28:52 crc kubenswrapper[5072]: I0228 04:28:52.151529 5072 generic.go:334] "Generic (PLEG): container finished" podID="f489d842-f259-4a58-af65-16f8d04dfa07" containerID="0cf647178618894fa41d0aabb4dd23ee31971bc7401ba75c71d69b60e4e65c83" exitCode=0 Feb 28 04:28:52 crc kubenswrapper[5072]: I0228 04:28:52.151563 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" event={"ID":"f489d842-f259-4a58-af65-16f8d04dfa07","Type":"ContainerDied","Data":"0cf647178618894fa41d0aabb4dd23ee31971bc7401ba75c71d69b60e4e65c83"} Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.160530 5072 generic.go:334] "Generic (PLEG): container finished" podID="2b9b43ee-d217-4d73-8029-176c01146473" containerID="0753d3168a45c1b43d11c6abbc0f2d6e050f3147149778501091ade7e9988331" exitCode=0 Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.160581 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" event={"ID":"2b9b43ee-d217-4d73-8029-176c01146473","Type":"ContainerDied","Data":"0753d3168a45c1b43d11c6abbc0f2d6e050f3147149778501091ade7e9988331"} Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.494204 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.679141 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts\") pod \"f489d842-f259-4a58-af65-16f8d04dfa07\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.679276 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data\") pod \"f489d842-f259-4a58-af65-16f8d04dfa07\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.679351 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys\") pod \"f489d842-f259-4a58-af65-16f8d04dfa07\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.679399 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys\") pod \"f489d842-f259-4a58-af65-16f8d04dfa07\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.679433 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsb6q\" (UniqueName: \"kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q\") pod \"f489d842-f259-4a58-af65-16f8d04dfa07\" (UID: \"f489d842-f259-4a58-af65-16f8d04dfa07\") " Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.684704 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f489d842-f259-4a58-af65-16f8d04dfa07" (UID: "f489d842-f259-4a58-af65-16f8d04dfa07"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.685146 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f489d842-f259-4a58-af65-16f8d04dfa07" (UID: "f489d842-f259-4a58-af65-16f8d04dfa07"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.685394 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts" (OuterVolumeSpecName: "scripts") pod "f489d842-f259-4a58-af65-16f8d04dfa07" (UID: "f489d842-f259-4a58-af65-16f8d04dfa07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.685788 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q" (OuterVolumeSpecName: "kube-api-access-xsb6q") pod "f489d842-f259-4a58-af65-16f8d04dfa07" (UID: "f489d842-f259-4a58-af65-16f8d04dfa07"). InnerVolumeSpecName "kube-api-access-xsb6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.702289 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data" (OuterVolumeSpecName: "config-data") pod "f489d842-f259-4a58-af65-16f8d04dfa07" (UID: "f489d842-f259-4a58-af65-16f8d04dfa07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.781261 5072 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.781296 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsb6q\" (UniqueName: \"kubernetes.io/projected/f489d842-f259-4a58-af65-16f8d04dfa07-kube-api-access-xsb6q\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.781308 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.781318 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:53 crc kubenswrapper[5072]: I0228 04:28:53.781326 5072 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f489d842-f259-4a58-af65-16f8d04dfa07-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.172153 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" event={"ID":"f489d842-f259-4a58-af65-16f8d04dfa07","Type":"ContainerDied","Data":"aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6"} Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.172178 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-gffh8" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.172551 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aec823341acb2f76c73183577bebf51a7b06862b90dafca7ba46a99f81c6b7b6" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.174546 5072 generic.go:334] "Generic (PLEG): container finished" podID="2b9b43ee-d217-4d73-8029-176c01146473" containerID="4d8a41aacb1e93112baac1a853fd5334e2d362789f3d02d90928e91c78a0e690" exitCode=0 Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.174581 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" event={"ID":"2b9b43ee-d217-4d73-8029-176c01146473","Type":"ContainerDied","Data":"4d8a41aacb1e93112baac1a853fd5334e2d362789f3d02d90928e91c78a0e690"} Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.272105 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"] Feb 28 04:28:54 crc kubenswrapper[5072]: E0228 04:28:54.272456 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f489d842-f259-4a58-af65-16f8d04dfa07" containerName="keystone-bootstrap" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.272477 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f489d842-f259-4a58-af65-16f8d04dfa07" containerName="keystone-bootstrap" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.272652 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f489d842-f259-4a58-af65-16f8d04dfa07" containerName="keystone-bootstrap" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.273248 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.275249 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.275684 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-r2z6h" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.276913 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.277726 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.279861 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"] Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.391289 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.391587 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.391711 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7c9m\" (UniqueName: 
\"kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.391848 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.391998 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.493067 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.493477 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.493602 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.493723 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7c9m\" (UniqueName: \"kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.493817 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.500052 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.501059 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.502238 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys\") 
pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.518707 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7c9m\" (UniqueName: \"kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.523985 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys\") pod \"keystone-577bcf6dcc-r9srn\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:54 crc kubenswrapper[5072]: I0228 04:28:54.688132 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.142654 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"] Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.181125 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" event={"ID":"fab191f5-56a2-4b06-88be-14286e763b52","Type":"ContainerStarted","Data":"6e2686e56a3e7d735e0b953e61784ab5586e46342f298e2d62b0bfb93f7da99b"} Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.384114 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.512857 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle\") pod \"2b9b43ee-d217-4d73-8029-176c01146473\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.512982 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fcmj\" (UniqueName: \"kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj\") pod \"2b9b43ee-d217-4d73-8029-176c01146473\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.513097 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util\") pod \"2b9b43ee-d217-4d73-8029-176c01146473\" (UID: \"2b9b43ee-d217-4d73-8029-176c01146473\") " Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.513733 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle" (OuterVolumeSpecName: "bundle") pod "2b9b43ee-d217-4d73-8029-176c01146473" (UID: "2b9b43ee-d217-4d73-8029-176c01146473"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.518971 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj" (OuterVolumeSpecName: "kube-api-access-9fcmj") pod "2b9b43ee-d217-4d73-8029-176c01146473" (UID: "2b9b43ee-d217-4d73-8029-176c01146473"). InnerVolumeSpecName "kube-api-access-9fcmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.528558 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util" (OuterVolumeSpecName: "util") pod "2b9b43ee-d217-4d73-8029-176c01146473" (UID: "2b9b43ee-d217-4d73-8029-176c01146473"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.614553 5072 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-util\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.614598 5072 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9b43ee-d217-4d73-8029-176c01146473-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:55 crc kubenswrapper[5072]: I0228 04:28:55.614611 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fcmj\" (UniqueName: \"kubernetes.io/projected/2b9b43ee-d217-4d73-8029-176c01146473-kube-api-access-9fcmj\") on node \"crc\" DevicePath \"\"" Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.193146 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" event={"ID":"fab191f5-56a2-4b06-88be-14286e763b52","Type":"ContainerStarted","Data":"d9899680691ab4bc6ab96c216a2101cf16f665914dcec5c175418bc070483704"} Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.193338 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.196689 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" 
event={"ID":"2b9b43ee-d217-4d73-8029-176c01146473","Type":"ContainerDied","Data":"d3b988c0b6993518ee790a1baad317aec751b7016dadc021773980a3afc18bb2"} Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.196730 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl" Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.196750 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b988c0b6993518ee790a1baad317aec751b7016dadc021773980a3afc18bb2" Feb 28 04:28:56 crc kubenswrapper[5072]: I0228 04:28:56.218063 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" podStartSLOduration=2.218040624 podStartE2EDuration="2.218040624s" podCreationTimestamp="2026-02-28 04:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:28:56.211155009 +0000 UTC m=+1158.205885221" watchObservedRunningTime="2026-02-28 04:28:56.218040624 +0000 UTC m=+1158.212770816" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.349563 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"] Feb 28 04:29:08 crc kubenswrapper[5072]: E0228 04:29:08.350598 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="extract" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.350618 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="extract" Feb 28 04:29:08 crc kubenswrapper[5072]: E0228 04:29:08.350665 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="pull" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 
04:29:08.350675 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="pull" Feb 28 04:29:08 crc kubenswrapper[5072]: E0228 04:29:08.350693 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="util" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.350701 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="util" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.350860 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9b43ee-d217-4d73-8029-176c01146473" containerName="extract" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.351451 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.353494 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9cmlr" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.353810 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.370264 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"] Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.494447 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhh2\" (UniqueName: \"kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " 
pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.494539 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.494977 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.596114 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhh2\" (UniqueName: \"kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.596166 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.596229 5072 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.602883 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.603232 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.612503 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhh2\" (UniqueName: \"kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2\") pod \"horizon-operator-controller-manager-6c7c8d5cfd-hjt58\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:08 crc kubenswrapper[5072]: I0228 04:29:08.672413 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:09 crc kubenswrapper[5072]: I0228 04:29:09.122780 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"] Feb 28 04:29:09 crc kubenswrapper[5072]: I0228 04:29:09.305578 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" event={"ID":"ba36008b-f798-4c99-bb4a-684f98897de8","Type":"ContainerStarted","Data":"0be814b6a25b4953055eafc33d110d0f5809d2b0666a2a80939498a563f18f41"} Feb 28 04:29:12 crc kubenswrapper[5072]: I0228 04:29:12.327576 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" event={"ID":"ba36008b-f798-4c99-bb4a-684f98897de8","Type":"ContainerStarted","Data":"ee773408be9a18b6a91dab47e9145ee0ac6cb6b2e5f91475156fb5be304a679f"} Feb 28 04:29:12 crc kubenswrapper[5072]: I0228 04:29:12.328291 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:18 crc kubenswrapper[5072]: I0228 04:29:18.677810 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:29:18 crc kubenswrapper[5072]: I0228 04:29:18.705948 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" podStartSLOduration=8.104387472 podStartE2EDuration="10.705923977s" podCreationTimestamp="2026-02-28 04:29:08 +0000 UTC" firstStartedPulling="2026-02-28 04:29:09.131061477 +0000 UTC m=+1171.125791669" lastFinishedPulling="2026-02-28 04:29:11.732597982 +0000 UTC m=+1173.727328174" observedRunningTime="2026-02-28 04:29:12.347756623 +0000 UTC 
m=+1174.342486825" watchObservedRunningTime="2026-02-28 04:29:18.705923977 +0000 UTC m=+1180.700654169" Feb 28 04:29:20 crc kubenswrapper[5072]: I0228 04:29:20.105909 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:29:20 crc kubenswrapper[5072]: I0228 04:29:20.106216 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.828355 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.830219 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.831831 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.832891 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.834611 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-2jcst" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.834989 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.839432 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.914200 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.914244 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.914267 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xjm\" (UniqueName: 
\"kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.914296 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.914319 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.934452 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.935794 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:23 crc kubenswrapper[5072]: I0228 04:29:23.943091 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015628 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015745 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xjm\" (UniqueName: \"kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015808 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015857 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015892 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.015963 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.016102 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fm4\" (UniqueName: \"kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.016177 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.016226 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.016258 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.016352 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.017318 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.017393 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.020980 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key\") pod \"horizon-6675bd755-28f6k\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.032790 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xjm\" (UniqueName: \"kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm\") pod \"horizon-6675bd755-28f6k\" (UID: 
\"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.117209 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.117308 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.117357 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96fm4\" (UniqueName: \"kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.117393 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.117411 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " 
pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.118780 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.118823 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.118977 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.121021 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.134477 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fm4\" (UniqueName: \"kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4\") pod \"horizon-8bb8556c5-jgg85\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.163054 
5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.259787 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.487278 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:29:24 crc kubenswrapper[5072]: W0228 04:29:24.492195 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca5f8af_9afd_47d4_8a2a_009df3216bc0.slice/crio-7f055a30bd2a43f11106e9030642daaca51b54626dfe5a6aa3e4298e15d9adfa WatchSource:0}: Error finding container 7f055a30bd2a43f11106e9030642daaca51b54626dfe5a6aa3e4298e15d9adfa: Status 404 returned error can't find the container with id 7f055a30bd2a43f11106e9030642daaca51b54626dfe5a6aa3e4298e15d9adfa Feb 28 04:29:24 crc kubenswrapper[5072]: I0228 04:29:24.558125 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:29:24 crc kubenswrapper[5072]: W0228 04:29:24.562283 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c98e9f_1dbd_4ff9_97a8_0d5e1c9b567b.slice/crio-2894b8fbcb7627c16db881665476b6cac64f33dd8306a09116b465dae9022e6e WatchSource:0}: Error finding container 2894b8fbcb7627c16db881665476b6cac64f33dd8306a09116b465dae9022e6e: Status 404 returned error can't find the container with id 2894b8fbcb7627c16db881665476b6cac64f33dd8306a09116b465dae9022e6e Feb 28 04:29:25 crc kubenswrapper[5072]: I0228 04:29:25.437951 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" 
event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerStarted","Data":"2894b8fbcb7627c16db881665476b6cac64f33dd8306a09116b465dae9022e6e"} Feb 28 04:29:25 crc kubenswrapper[5072]: I0228 04:29:25.439014 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerStarted","Data":"7f055a30bd2a43f11106e9030642daaca51b54626dfe5a6aa3e4298e15d9adfa"} Feb 28 04:29:26 crc kubenswrapper[5072]: I0228 04:29:26.829671 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 04:29:33.509992 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerStarted","Data":"5ebd71f81f76ab935aa0d5b963704646705e6675f6923ea8c34c394a37fcc5bd"} Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 04:29:33.510481 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerStarted","Data":"96579f554c511ece64d1cad5dc1888a16733204ec0a7f9bef12c790d1494bd05"} Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 04:29:33.511944 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerStarted","Data":"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f"} Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 04:29:33.511990 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerStarted","Data":"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e"} Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 
04:29:33.527021 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podStartSLOduration=2.704139164 podStartE2EDuration="10.527006867s" podCreationTimestamp="2026-02-28 04:29:23 +0000 UTC" firstStartedPulling="2026-02-28 04:29:24.565995788 +0000 UTC m=+1186.560726000" lastFinishedPulling="2026-02-28 04:29:32.388863511 +0000 UTC m=+1194.383593703" observedRunningTime="2026-02-28 04:29:33.526133859 +0000 UTC m=+1195.520864051" watchObservedRunningTime="2026-02-28 04:29:33.527006867 +0000 UTC m=+1195.521737059" Feb 28 04:29:33 crc kubenswrapper[5072]: I0228 04:29:33.547189 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podStartSLOduration=2.633143998 podStartE2EDuration="10.547172619s" podCreationTimestamp="2026-02-28 04:29:23 +0000 UTC" firstStartedPulling="2026-02-28 04:29:24.494174946 +0000 UTC m=+1186.488905158" lastFinishedPulling="2026-02-28 04:29:32.408203587 +0000 UTC m=+1194.402933779" observedRunningTime="2026-02-28 04:29:33.545506486 +0000 UTC m=+1195.540236678" watchObservedRunningTime="2026-02-28 04:29:33.547172619 +0000 UTC m=+1195.541902801" Feb 28 04:29:34 crc kubenswrapper[5072]: I0228 04:29:34.163978 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:34 crc kubenswrapper[5072]: I0228 04:29:34.164267 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:34 crc kubenswrapper[5072]: I0228 04:29:34.261253 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:34 crc kubenswrapper[5072]: I0228 04:29:34.261301 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:44 crc kubenswrapper[5072]: 
I0228 04:29:44.165055 5072 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.87:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.87:8080: connect: connection refused" Feb 28 04:29:44 crc kubenswrapper[5072]: I0228 04:29:44.262854 5072 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.88:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.88:8080: connect: connection refused" Feb 28 04:29:50 crc kubenswrapper[5072]: I0228 04:29:50.106016 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:29:50 crc kubenswrapper[5072]: I0228 04:29:50.106419 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:29:56 crc kubenswrapper[5072]: I0228 04:29:56.164908 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:56 crc kubenswrapper[5072]: I0228 04:29:56.190766 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:58 crc kubenswrapper[5072]: I0228 04:29:58.094662 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:29:58 crc kubenswrapper[5072]: I0228 04:29:58.202746 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:29:58 crc kubenswrapper[5072]: I0228 04:29:58.258892 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:29:58 crc kubenswrapper[5072]: I0228 04:29:58.704234 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon-log" containerID="cri-o://96579f554c511ece64d1cad5dc1888a16733204ec0a7f9bef12c790d1494bd05" gracePeriod=30 Feb 28 04:29:58 crc kubenswrapper[5072]: I0228 04:29:58.704266 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" containerID="cri-o://5ebd71f81f76ab935aa0d5b963704646705e6675f6923ea8c34c394a37fcc5bd" gracePeriod=30 Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.152334 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537550-stc8d"] Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.153445 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.159441 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j"] Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.160279 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.165237 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.165492 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.165522 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.165674 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.165879 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.169524 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537550-stc8d"] Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.193865 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j"] Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.309897 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zbz\" (UniqueName: \"kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz\") pod \"auto-csr-approver-29537550-stc8d\" (UID: \"17749401-23d2-4c37-b692-9f163e29b7b7\") " pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.310010 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.310037 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jm8\" (UniqueName: \"kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.310086 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.411818 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.411855 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jm8\" (UniqueName: \"kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.411893 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.411952 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zbz\" (UniqueName: \"kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz\") pod \"auto-csr-approver-29537550-stc8d\" (UID: \"17749401-23d2-4c37-b692-9f163e29b7b7\") " pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.413257 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.418073 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume\") pod \"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.427103 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jm8\" (UniqueName: \"kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8\") pod 
\"collect-profiles-29537550-vgv7j\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.432542 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zbz\" (UniqueName: \"kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz\") pod \"auto-csr-approver-29537550-stc8d\" (UID: \"17749401-23d2-4c37-b692-9f163e29b7b7\") " pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.477995 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.500489 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.758735 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j"] Feb 28 04:30:00 crc kubenswrapper[5072]: I0228 04:30:00.903568 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537550-stc8d"] Feb 28 04:30:00 crc kubenswrapper[5072]: W0228 04:30:00.918231 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17749401_23d2_4c37_b692_9f163e29b7b7.slice/crio-8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a WatchSource:0}: Error finding container 8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a: Status 404 returned error can't find the container with id 8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.436912 5072 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["horizon-kuttl-tests/horizon-845cfdcdb-l9f5g"] Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.438547 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.441257 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-policy" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.454917 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-845cfdcdb-l9f5g"] Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.494600 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.494865 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon-log" containerID="cri-o://208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e" gracePeriod=30 Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.495221 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" containerID="cri-o://041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f" gracePeriod=30 Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.497821 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-845cfdcdb-l9f5g"] Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.498741 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data horizon-secret-key kube-api-access-ff6bq logs policy scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" 
podUID="239571e5-6035-44d1-a8b0-870b6bd5c0ef" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530252 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6bq\" (UniqueName: \"kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530320 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530392 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530438 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530510 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy\") pod \"horizon-845cfdcdb-l9f5g\" (UID: 
\"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.530551 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.631918 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6bq\" (UniqueName: \"kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.631989 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.632027 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.632067 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " 
pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.632115 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632140 5072 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632159 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632204 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:02.132187861 +0000 UTC m=+1224.126918053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : configmap "horizon-scripts" not found Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632217 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:02.132211421 +0000 UTC m=+1224.126941613 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : secret "horizon" not found Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.632148 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632280 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.632299 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:02.132293614 +0000 UTC m=+1224.127023806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : configmap "horizon-config-data" not found Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.632898 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.633141 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.635704 5072 projected.go:194] Error preparing data for projected volume kube-api-access-ff6bq for pod horizon-kuttl-tests/horizon-845cfdcdb-l9f5g: failed to fetch token: serviceaccounts "horizon-horizon" not found Feb 28 04:30:01 crc kubenswrapper[5072]: E0228 04:30:01.635752 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:02.135741231 +0000 UTC m=+1224.130471413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ff6bq" (UniqueName: "kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : failed to fetch token: serviceaccounts "horizon-horizon" not found Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.724276 5072 generic.go:334] "Generic (PLEG): container finished" podID="f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" containerID="db09ba9960794f258d039da2c0e6da7a23937a214bba1c7a0ec4805ebb986c4d" exitCode=0 Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.724330 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" event={"ID":"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce","Type":"ContainerDied","Data":"db09ba9960794f258d039da2c0e6da7a23937a214bba1c7a0ec4805ebb986c4d"} Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.724382 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" event={"ID":"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce","Type":"ContainerStarted","Data":"c09c621e25869d8759eb85f5f22c15c7ac01a6e2a0e337b7ba47f926c9986fa7"} Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.725763 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537550-stc8d" event={"ID":"17749401-23d2-4c37-b692-9f163e29b7b7","Type":"ContainerStarted","Data":"8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a"} Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.725786 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.733079 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835115 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs\") pod \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835174 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy\") pod \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835504 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs" (OuterVolumeSpecName: "logs") pod "239571e5-6035-44d1-a8b0-870b6bd5c0ef" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835681 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy" (OuterVolumeSpecName: "policy") pod "239571e5-6035-44d1-a8b0-870b6bd5c0ef" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef"). InnerVolumeSpecName "policy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835921 5072 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239571e5-6035-44d1-a8b0-870b6bd5c0ef-logs\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:01 crc kubenswrapper[5072]: I0228 04:30:01.835940 5072 reconciler_common.go:293] "Volume detached for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-policy\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.140422 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.140586 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.140800 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6bq\" (UniqueName: \"kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.140835 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:03.140816403 +0000 UTC m=+1225.135546595 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : configmap "horizon-config-data" not found Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.140852 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.140876 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts\") pod \"horizon-845cfdcdb-l9f5g\" (UID: \"239571e5-6035-44d1-a8b0-870b6bd5c0ef\") " pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.140996 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.141046 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:03.14103241 +0000 UTC m=+1225.135762602 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : configmap "horizon-scripts" not found Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.141103 5072 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.141124 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:03.141117713 +0000 UTC m=+1225.135847895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : secret "horizon" not found Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.143901 5072 projected.go:194] Error preparing data for projected volume kube-api-access-ff6bq for pod horizon-kuttl-tests/horizon-845cfdcdb-l9f5g: failed to fetch token: serviceaccounts "horizon-horizon" not found Feb 28 04:30:02 crc kubenswrapper[5072]: E0228 04:30:02.143994 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq podName:239571e5-6035-44d1-a8b0-870b6bd5c0ef nodeName:}" failed. No retries permitted until 2026-02-28 04:30:03.143968302 +0000 UTC m=+1225.138698544 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ff6bq" (UniqueName: "kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq") pod "horizon-845cfdcdb-l9f5g" (UID: "239571e5-6035-44d1-a8b0-870b6bd5c0ef") : failed to fetch token: serviceaccounts "horizon-horizon" not found Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.736492 5072 generic.go:334] "Generic (PLEG): container finished" podID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerID="5ebd71f81f76ab935aa0d5b963704646705e6675f6923ea8c34c394a37fcc5bd" exitCode=0 Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.736595 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerDied","Data":"5ebd71f81f76ab935aa0d5b963704646705e6675f6923ea8c34c394a37fcc5bd"} Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.736652 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-845cfdcdb-l9f5g" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.790484 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-845cfdcdb-l9f5g"] Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.799506 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-845cfdcdb-l9f5g"] Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.950838 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.950868 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/239571e5-6035-44d1-a8b0-870b6bd5c0ef-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 
04:30:02.950878 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6bq\" (UniqueName: \"kubernetes.io/projected/239571e5-6035-44d1-a8b0-870b6bd5c0ef-kube-api-access-ff6bq\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:02 crc kubenswrapper[5072]: I0228 04:30:02.950887 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/239571e5-6035-44d1-a8b0-870b6bd5c0ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.069406 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.156272 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume\") pod \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.156380 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume\") pod \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.156696 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jm8\" (UniqueName: \"kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8\") pod \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\" (UID: \"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce\") " Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.157173 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" (UID: "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.162321 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" (UID: "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.162607 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8" (OuterVolumeSpecName: "kube-api-access-w5jm8") pod "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" (UID: "f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce"). InnerVolumeSpecName "kube-api-access-w5jm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.258531 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jm8\" (UniqueName: \"kubernetes.io/projected/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-kube-api-access-w5jm8\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.258572 5072 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.258587 5072 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.760354 5072 generic.go:334] "Generic (PLEG): container finished" podID="17749401-23d2-4c37-b692-9f163e29b7b7" containerID="f98bb7ff081c8fd475cb4cb229f88a28dfc46cf3835b5762e73a883639c61235" exitCode=0 Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.760722 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537550-stc8d" event={"ID":"17749401-23d2-4c37-b692-9f163e29b7b7","Type":"ContainerDied","Data":"f98bb7ff081c8fd475cb4cb229f88a28dfc46cf3835b5762e73a883639c61235"} Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.764443 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" event={"ID":"f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce","Type":"ContainerDied","Data":"c09c621e25869d8759eb85f5f22c15c7ac01a6e2a0e337b7ba47f926c9986fa7"} Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.764482 5072 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c09c621e25869d8759eb85f5f22c15c7ac01a6e2a0e337b7ba47f926c9986fa7" Feb 28 04:30:03 crc kubenswrapper[5072]: I0228 04:30:03.764543 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537550-vgv7j" Feb 28 04:30:04 crc kubenswrapper[5072]: I0228 04:30:04.164692 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.87:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.87:8080: connect: connection refused" Feb 28 04:30:04 crc kubenswrapper[5072]: I0228 04:30:04.618877 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.88:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:33144->10.217.0.88:8080: read: connection reset by peer" Feb 28 04:30:04 crc kubenswrapper[5072]: I0228 04:30:04.669721 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239571e5-6035-44d1-a8b0-870b6bd5c0ef" path="/var/lib/kubelet/pods/239571e5-6035-44d1-a8b0-870b6bd5c0ef/volumes" Feb 28 04:30:04 crc kubenswrapper[5072]: I0228 04:30:04.772621 5072 generic.go:334] "Generic (PLEG): container finished" podID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerID="041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f" exitCode=0 Feb 28 04:30:04 crc kubenswrapper[5072]: I0228 04:30:04.772705 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerDied","Data":"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f"} Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 
04:30:05.091663 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.182289 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4zbz\" (UniqueName: \"kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz\") pod \"17749401-23d2-4c37-b692-9f163e29b7b7\" (UID: \"17749401-23d2-4c37-b692-9f163e29b7b7\") " Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.188959 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz" (OuterVolumeSpecName: "kube-api-access-t4zbz") pod "17749401-23d2-4c37-b692-9f163e29b7b7" (UID: "17749401-23d2-4c37-b692-9f163e29b7b7"). InnerVolumeSpecName "kube-api-access-t4zbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.284392 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4zbz\" (UniqueName: \"kubernetes.io/projected/17749401-23d2-4c37-b692-9f163e29b7b7-kube-api-access-t4zbz\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.780909 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537550-stc8d" event={"ID":"17749401-23d2-4c37-b692-9f163e29b7b7","Type":"ContainerDied","Data":"8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a"} Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.780950 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537550-stc8d" Feb 28 04:30:05 crc kubenswrapper[5072]: I0228 04:30:05.780957 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8676bfdabcc08864721ddc702013a57e2b34078c1ae6ea988a42ab31374f1d9a" Feb 28 04:30:06 crc kubenswrapper[5072]: I0228 04:30:06.156668 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537544-msjzd"] Feb 28 04:30:06 crc kubenswrapper[5072]: I0228 04:30:06.163973 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537544-msjzd"] Feb 28 04:30:06 crc kubenswrapper[5072]: I0228 04:30:06.667993 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01f23d6-12a8-4823-9d45-763f79cca9a8" path="/var/lib/kubelet/pods/e01f23d6-12a8-4823-9d45-763f79cca9a8/volumes" Feb 28 04:30:14 crc kubenswrapper[5072]: I0228 04:30:14.164243 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.87:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.87:8080: connect: connection refused" Feb 28 04:30:14 crc kubenswrapper[5072]: I0228 04:30:14.261452 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.88:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.88:8080: connect: connection refused" Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.105774 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.106108 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.106155 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.106862 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.106914 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04" gracePeriod=600 Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.500207 5072 scope.go:117] "RemoveContainer" containerID="307c6a4dfd8b3e4e860aef830a0db6f175c69340657a1e7afe18a4191498f545" Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.886910 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04" exitCode=0 Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.886957 5072 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04"} Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.887030 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67"} Feb 28 04:30:20 crc kubenswrapper[5072]: I0228 04:30:20.887047 5072 scope.go:117] "RemoveContainer" containerID="e53c192baa0cf41417fc28e90ae7b328b499a54a241b5398391870c675f33023" Feb 28 04:30:24 crc kubenswrapper[5072]: I0228 04:30:24.164506 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.87:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.87:8080: connect: connection refused" Feb 28 04:30:24 crc kubenswrapper[5072]: I0228 04:30:24.165106 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:30:24 crc kubenswrapper[5072]: I0228 04:30:24.261993 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.0.88:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.88:8080: connect: connection refused" Feb 28 04:30:24 crc kubenswrapper[5072]: I0228 04:30:24.262138 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:30:28 crc kubenswrapper[5072]: I0228 
04:30:28.971799 5072 generic.go:334] "Generic (PLEG): container finished" podID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerID="96579f554c511ece64d1cad5dc1888a16733204ec0a7f9bef12c790d1494bd05" exitCode=137 Feb 28 04:30:28 crc kubenswrapper[5072]: I0228 04:30:28.971889 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerDied","Data":"96579f554c511ece64d1cad5dc1888a16733204ec0a7f9bef12c790d1494bd05"} Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.048779 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.180182 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs\") pod \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.180249 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts\") pod \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.180365 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key\") pod \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.180437 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data\") pod \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.180481 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7xjm\" (UniqueName: \"kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm\") pod \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\" (UID: \"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b\") " Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.181962 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs" (OuterVolumeSpecName: "logs") pod "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" (UID: "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.187827 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" (UID: "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.188806 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm" (OuterVolumeSpecName: "kube-api-access-q7xjm") pod "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" (UID: "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b"). InnerVolumeSpecName "kube-api-access-q7xjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.202684 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data" (OuterVolumeSpecName: "config-data") pod "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" (UID: "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.204445 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts" (OuterVolumeSpecName: "scripts") pod "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" (UID: "73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.282146 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.282427 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7xjm\" (UniqueName: \"kubernetes.io/projected/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-kube-api-access-q7xjm\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.282514 5072 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-logs\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.282570 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:29 crc kubenswrapper[5072]: 
I0228 04:30:29.282658 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.979708 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" event={"ID":"73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b","Type":"ContainerDied","Data":"2894b8fbcb7627c16db881665476b6cac64f33dd8306a09116b465dae9022e6e"} Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.979733 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6675bd755-28f6k" Feb 28 04:30:29 crc kubenswrapper[5072]: I0228 04:30:29.979767 5072 scope.go:117] "RemoveContainer" containerID="5ebd71f81f76ab935aa0d5b963704646705e6675f6923ea8c34c394a37fcc5bd" Feb 28 04:30:30 crc kubenswrapper[5072]: I0228 04:30:30.010953 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:30:30 crc kubenswrapper[5072]: I0228 04:30:30.017671 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-6675bd755-28f6k"] Feb 28 04:30:30 crc kubenswrapper[5072]: I0228 04:30:30.143568 5072 scope.go:117] "RemoveContainer" containerID="96579f554c511ece64d1cad5dc1888a16733204ec0a7f9bef12c790d1494bd05" Feb 28 04:30:30 crc kubenswrapper[5072]: I0228 04:30:30.670178 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" path="/var/lib/kubelet/pods/73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b/volumes" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.815624 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.921462 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts\") pod \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.921542 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key\") pod \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.921600 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96fm4\" (UniqueName: \"kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4\") pod \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.921629 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data\") pod \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.921679 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs\") pod \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\" (UID: \"fca5f8af-9afd-47d4-8a2a-009df3216bc0\") " Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.922292 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs" (OuterVolumeSpecName: "logs") pod "fca5f8af-9afd-47d4-8a2a-009df3216bc0" (UID: "fca5f8af-9afd-47d4-8a2a-009df3216bc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.926818 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4" (OuterVolumeSpecName: "kube-api-access-96fm4") pod "fca5f8af-9afd-47d4-8a2a-009df3216bc0" (UID: "fca5f8af-9afd-47d4-8a2a-009df3216bc0"). InnerVolumeSpecName "kube-api-access-96fm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.927068 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fca5f8af-9afd-47d4-8a2a-009df3216bc0" (UID: "fca5f8af-9afd-47d4-8a2a-009df3216bc0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.937571 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data" (OuterVolumeSpecName: "config-data") pod "fca5f8af-9afd-47d4-8a2a-009df3216bc0" (UID: "fca5f8af-9afd-47d4-8a2a-009df3216bc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.938787 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts" (OuterVolumeSpecName: "scripts") pod "fca5f8af-9afd-47d4-8a2a-009df3216bc0" (UID: "fca5f8af-9afd-47d4-8a2a-009df3216bc0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.993305 5072 generic.go:334] "Generic (PLEG): container finished" podID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerID="208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e" exitCode=137 Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.993389 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.993396 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerDied","Data":"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e"} Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.993720 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8bb8556c5-jgg85" event={"ID":"fca5f8af-9afd-47d4-8a2a-009df3216bc0","Type":"ContainerDied","Data":"7f055a30bd2a43f11106e9030642daaca51b54626dfe5a6aa3e4298e15d9adfa"} Feb 28 04:30:31 crc kubenswrapper[5072]: I0228 04:30:31.993738 5072 scope.go:117] "RemoveContainer" containerID="041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.022917 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.023814 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.023897 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fca5f8af-9afd-47d4-8a2a-009df3216bc0-horizon-secret-key\") on node 
\"crc\" DevicePath \"\"" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.023917 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96fm4\" (UniqueName: \"kubernetes.io/projected/fca5f8af-9afd-47d4-8a2a-009df3216bc0-kube-api-access-96fm4\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.023929 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca5f8af-9afd-47d4-8a2a-009df3216bc0-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.023941 5072 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca5f8af-9afd-47d4-8a2a-009df3216bc0-logs\") on node \"crc\" DevicePath \"\"" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.028198 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-8bb8556c5-jgg85"] Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.154536 5072 scope.go:117] "RemoveContainer" containerID="208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.170606 5072 scope.go:117] "RemoveContainer" containerID="041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f" Feb 28 04:30:32 crc kubenswrapper[5072]: E0228 04:30:32.171086 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f\": container with ID starting with 041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f not found: ID does not exist" containerID="041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.171127 5072 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f"} err="failed to get container status \"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f\": rpc error: code = NotFound desc = could not find container \"041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f\": container with ID starting with 041ad997be384f6c506ce0ab049948cfac3e3b9949fb73e9da6bd2499a3e7d3f not found: ID does not exist" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.171155 5072 scope.go:117] "RemoveContainer" containerID="208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e" Feb 28 04:30:32 crc kubenswrapper[5072]: E0228 04:30:32.171615 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e\": container with ID starting with 208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e not found: ID does not exist" containerID="208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.171675 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e"} err="failed to get container status \"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e\": rpc error: code = NotFound desc = could not find container \"208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e\": container with ID starting with 208a6b0bee8e5b454dc20ee825540d3e859f5febb9d06ce6546a7ae95b659b2e not found: ID does not exist" Feb 28 04:30:32 crc kubenswrapper[5072]: I0228 04:30:32.667759 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" path="/var/lib/kubelet/pods/fca5f8af-9afd-47d4-8a2a-009df3216bc0/volumes" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 
04:30:33.488079 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"] Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488704 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17749401-23d2-4c37-b692-9f163e29b7b7" containerName="oc" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488716 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="17749401-23d2-4c37-b692-9f163e29b7b7" containerName="oc" Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488739 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488745 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488754 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" containerName="collect-profiles" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488762 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" containerName="collect-profiles" Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488770 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488777 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488785 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488792 5072 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: E0228 04:30:33.488805 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.488813 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490826 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490845 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="17749401-23d2-4c37-b692-9f163e29b7b7" containerName="oc" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490894 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490905 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c98e9f-1dbd-4ff9-97a8-0d5e1c9b567b" containerName="horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490913 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca5f8af-9afd-47d4-8a2a-009df3216bc0" containerName="horizon-log" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.490932 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b8b5b3-38ba-4fc6-a92a-16b3ae90e4ce" containerName="collect-profiles" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.492822 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.495674 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"cert-horizon-svc" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.495931 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.496188 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.496342 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-dlpt9" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.496994 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.498324 5072 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"combined-ca-bundle" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.513193 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"] Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.568761 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"] Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.570473 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.583635 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"] Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632039 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632091 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632126 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632167 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632217 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632254 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsj4s\" (UniqueName: \"kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.632296 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734023 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734101 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72zx\" (UniqueName: \"kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 
04:30:33.734139 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734164 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734195 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734230 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734253 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 
04:30:33.734295 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734331 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsj4s\" (UniqueName: \"kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734356 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734408 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734446 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734477 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.734506 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.735388 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.735710 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.735913 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.742119 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.742321 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.742454 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.752041 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsj4s\" (UniqueName: \"kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s\") pod \"horizon-5b545c459d-rtjn2\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") " pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.808971 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.835803 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836087 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72zx\" (UniqueName: \"kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836226 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836322 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836511 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836715 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.836866 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.837342 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.837355 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.838125 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.841282 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.841482 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.843004 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.857848 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72zx\" (UniqueName: \"kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx\") pod \"horizon-579fd4dcd4-kw727\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") " pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:33 crc kubenswrapper[5072]: I0228 04:30:33.886003 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:34 crc kubenswrapper[5072]: I0228 04:30:34.017386 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"]
Feb 28 04:30:34 crc kubenswrapper[5072]: W0228 04:30:34.029598 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b5a237_8378_48db_b4b7_b624fe427c34.slice/crio-8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2 WatchSource:0}: Error finding container 8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2: Status 404 returned error can't find the container with id 8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2
Feb 28 04:30:34 crc kubenswrapper[5072]: I0228 04:30:34.108281 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"]
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.025185 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerStarted","Data":"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.025706 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerStarted","Data":"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.025719 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerStarted","Data":"57fd5d55e3147c2ecc214500975bb4f852857a45c5ffedcfe16fad7a9815ff60"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.026802 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerStarted","Data":"6cda1568b5fdb000bfdf309b5a826a54d00f75e7aa1101d47c352ec97792f81f"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.026837 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerStarted","Data":"6188312132e61e01ceba9028d903a90f7e8327ac232362763ecfa9f41ffcfd93"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.026851 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerStarted","Data":"8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2"}
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.044913 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podStartSLOduration=2.044889191 podStartE2EDuration="2.044889191s" podCreationTimestamp="2026-02-28 04:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:30:35.044660864 +0000 UTC m=+1257.039391076" watchObservedRunningTime="2026-02-28 04:30:35.044889191 +0000 UTC m=+1257.039619393"
Feb 28 04:30:35 crc kubenswrapper[5072]: I0228 04:30:35.064791 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podStartSLOduration=2.064766815 podStartE2EDuration="2.064766815s" podCreationTimestamp="2026-02-28 04:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:30:35.063005839 +0000 UTC m=+1257.057736051" watchObservedRunningTime="2026-02-28 04:31:35.064766815 +0000 UTC m=+1257.059497007"
Feb 28 04:30:43 crc kubenswrapper[5072]: I0228 04:30:43.809459 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:43 crc kubenswrapper[5072]: I0228 04:30:43.809940 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:30:43 crc kubenswrapper[5072]: I0228 04:30:43.886559 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:43 crc kubenswrapper[5072]: I0228 04:30:43.886607 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:30:53 crc kubenswrapper[5072]: I0228 04:30:53.817823 5072 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.93:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.93:8443: connect: connection refused"
Feb 28 04:30:53 crc kubenswrapper[5072]: I0228 04:30:53.888203 5072 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.94:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.94:8443: connect: connection refused"
Feb 28 04:31:05 crc kubenswrapper[5072]: I0228 04:31:05.555081 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:31:05 crc kubenswrapper[5072]: I0228 04:31:05.647224 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.254516 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.278448 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.381240 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"]
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.406189 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon-log" containerID="cri-o://6188312132e61e01ceba9028d903a90f7e8327ac232362763ecfa9f41ffcfd93" gracePeriod=30
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.406988 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" containerID="cri-o://6cda1568b5fdb000bfdf309b5a826a54d00f75e7aa1101d47c352ec97792f81f" gracePeriod=30
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.961418 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"]
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.961793 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon-log" containerID="cri-o://c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea" gracePeriod=30
Feb 28 04:31:07 crc kubenswrapper[5072]: I0228 04:31:07.961856 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" containerID="cri-o://568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522" gracePeriod=30
Feb 28 04:31:11 crc kubenswrapper[5072]: I0228 04:31:11.432863 5072 generic.go:334] "Generic (PLEG): container finished" podID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerID="6cda1568b5fdb000bfdf309b5a826a54d00f75e7aa1101d47c352ec97792f81f" exitCode=0
Feb 28 04:31:11 crc kubenswrapper[5072]: I0228 04:31:11.432938 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerDied","Data":"6cda1568b5fdb000bfdf309b5a826a54d00f75e7aa1101d47c352ec97792f81f"}
Feb 28 04:31:11 crc kubenswrapper[5072]: I0228 04:31:11.436098 5072 generic.go:334] "Generic (PLEG): container finished" podID="6c084dea-7515-4696-9e28-a5a332d954ad" containerID="568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522" exitCode=0
Feb 28 04:31:11 crc kubenswrapper[5072]: I0228 04:31:11.436190 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerDied","Data":"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522"}
Feb 28 04:31:13 crc kubenswrapper[5072]: I0228 04:31:13.809726 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.93:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.93:8443: connect: connection refused"
Feb 28 04:31:13 crc kubenswrapper[5072]: I0228 04:31:13.887473 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.94:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.94:8443: connect: connection refused"
Feb 28 04:31:23 crc kubenswrapper[5072]: I0228 04:31:23.811004 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.93:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.93:8443: connect: connection refused"
Feb 28 04:31:23 crc kubenswrapper[5072]: I0228 04:31:23.887482 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.94:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.94:8443: connect: connection refused"
Feb 28 04:31:33 crc kubenswrapper[5072]: I0228 04:31:33.810362 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.93:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.93:8443: connect: connection refused"
Feb 28 04:31:33 crc kubenswrapper[5072]: I0228 04:31:33.810988 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:31:33 crc kubenswrapper[5072]: I0228 04:31:33.891155 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.94:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.94:8443: connect: connection refused"
Feb 28 04:31:33 crc kubenswrapper[5072]: I0228 04:31:33.891314 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.717080 5072 generic.go:334] "Generic (PLEG): container finished" podID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerID="6188312132e61e01ceba9028d903a90f7e8327ac232362763ecfa9f41ffcfd93" exitCode=137
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.717273 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerDied","Data":"6188312132e61e01ceba9028d903a90f7e8327ac232362763ecfa9f41ffcfd93"}
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.717416 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" event={"ID":"d0b5a237-8378-48db-b4b7-b624fe427c34","Type":"ContainerDied","Data":"8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2"}
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.717431 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e41d90dd149c33fd6774bf37200f223a494bf495882fe6dd6af11041ce7abd2"
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.764995 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2"
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885374 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885471 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885490 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885513 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885559 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885595 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.885669 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsj4s\" (UniqueName: \"kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s\") pod \"d0b5a237-8378-48db-b4b7-b624fe427c34\" (UID: \"d0b5a237-8378-48db-b4b7-b624fe427c34\") "
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.886126 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs" (OuterVolumeSpecName: "logs") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.891842 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.891843 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s" (OuterVolumeSpecName: "kube-api-access-xsj4s") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "kube-api-access-xsj4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.905423 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data" (OuterVolumeSpecName: "config-data") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.906253 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts" (OuterVolumeSpecName: "scripts") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.908724 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.926635 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d0b5a237-8378-48db-b4b7-b624fe427c34" (UID: "d0b5a237-8378-48db-b4b7-b624fe427c34"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987112 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsj4s\" (UniqueName: \"kubernetes.io/projected/d0b5a237-8378-48db-b4b7-b624fe427c34-kube-api-access-xsj4s\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987142 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987152 5072 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b5a237-8378-48db-b4b7-b624fe427c34-logs\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987162 5072 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987170 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987178 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0b5a237-8378-48db-b4b7-b624fe427c34-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:37 crc kubenswrapper[5072]: I0228 04:31:37.987187 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b5a237-8378-48db-b4b7-b624fe427c34-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.234533 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727"
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.392725 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72zx\" (UniqueName: \"kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.392892 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.392961 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.393008 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.393047 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.393140 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.393191 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs\") pod \"6c084dea-7515-4696-9e28-a5a332d954ad\" (UID: \"6c084dea-7515-4696-9e28-a5a332d954ad\") "
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.393845 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs" (OuterVolumeSpecName: "logs") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.397107 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx" (OuterVolumeSpecName: "kube-api-access-f72zx") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "kube-api-access-f72zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.399119 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.414164 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts" (OuterVolumeSpecName: "scripts") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.414270 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data" (OuterVolumeSpecName: "config-data") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.414481 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.425766 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "6c084dea-7515-4696-9e28-a5a332d954ad" (UID: "6c084dea-7515-4696-9e28-a5a332d954ad"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495520 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495582 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72zx\" (UniqueName: \"kubernetes.io/projected/6c084dea-7515-4696-9e28-a5a332d954ad-kube-api-access-f72zx\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495604 5072 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495624 5072 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c084dea-7515-4696-9e28-a5a332d954ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495676 5072 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c084dea-7515-4696-9e28-a5a332d954ad-logs\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495706 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.495725 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c084dea-7515-4696-9e28-a5a332d954ad-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.729498 5072 generic.go:334]
"Generic (PLEG): container finished" podID="6c084dea-7515-4696-9e28-a5a332d954ad" containerID="c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea" exitCode=137 Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.729582 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.730062 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-5b545c459d-rtjn2" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.729615 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerDied","Data":"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea"} Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.730132 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-579fd4dcd4-kw727" event={"ID":"6c084dea-7515-4696-9e28-a5a332d954ad","Type":"ContainerDied","Data":"57fd5d55e3147c2ecc214500975bb4f852857a45c5ffedcfe16fad7a9815ff60"} Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.730159 5072 scope.go:117] "RemoveContainer" containerID="568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.774972 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"] Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.783169 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-579fd4dcd4-kw727"] Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.787602 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"] Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.792000 5072 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["horizon-kuttl-tests/horizon-5b545c459d-rtjn2"] Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.951920 5072 scope.go:117] "RemoveContainer" containerID="c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.976817 5072 scope.go:117] "RemoveContainer" containerID="568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522" Feb 28 04:31:38 crc kubenswrapper[5072]: E0228 04:31:38.977387 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522\": container with ID starting with 568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522 not found: ID does not exist" containerID="568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.977438 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522"} err="failed to get container status \"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522\": rpc error: code = NotFound desc = could not find container \"568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522\": container with ID starting with 568756ab15b79c1920a3e88d3bac7f5042e26f4c4c952aefd63eee0832cf7522 not found: ID does not exist" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.977472 5072 scope.go:117] "RemoveContainer" containerID="c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea" Feb 28 04:31:38 crc kubenswrapper[5072]: E0228 04:31:38.977927 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea\": container with ID starting with 
c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea not found: ID does not exist" containerID="c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea" Feb 28 04:31:38 crc kubenswrapper[5072]: I0228 04:31:38.977967 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea"} err="failed to get container status \"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea\": rpc error: code = NotFound desc = could not find container \"c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea\": container with ID starting with c05e13d0e3c4c8f2d53d1828f88eb971dd380430790461ab48790cf4a36c30ea not found: ID does not exist" Feb 28 04:31:40 crc kubenswrapper[5072]: I0228 04:31:40.669530 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" path="/var/lib/kubelet/pods/6c084dea-7515-4696-9e28-a5a332d954ad/volumes" Feb 28 04:31:40 crc kubenswrapper[5072]: I0228 04:31:40.670865 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" path="/var/lib/kubelet/pods/d0b5a237-8378-48db-b4b7-b624fe427c34/volumes" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.401804 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-lxxwv"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.416048 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-gffh8"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.426035 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-gffh8"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.437709 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-lxxwv"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 
04:31:45.447070 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.447402 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" podUID="fab191f5-56a2-4b06-88be-14286e763b52" containerName="keystone-api" containerID="cri-o://d9899680691ab4bc6ab96c216a2101cf16f665914dcec5c175418bc070483704" gracePeriod=30 Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468168 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"] Feb 28 04:31:45 crc kubenswrapper[5072]: E0228 04:31:45.468466 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468483 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: E0228 04:31:45.468495 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468501 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: E0228 04:31:45.468516 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468522 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: E0228 04:31:45.468533 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" 
containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468540 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468698 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468709 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c084dea-7515-4696-9e28-a5a332d954ad" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468721 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.468728 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b5a237-8378-48db-b4b7-b624fe427c34" containerName="horizon-log" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.469165 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.481890 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"] Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.605540 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk79r\" (UniqueName: \"kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.605777 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.707164 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.707376 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk79r\" (UniqueName: \"kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc 
kubenswrapper[5072]: I0228 04:31:45.708125 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.732269 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk79r\" (UniqueName: \"kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r\") pod \"keystonef9dc-account-delete-9skqx\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") " pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:45 crc kubenswrapper[5072]: I0228 04:31:45.796789 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.231860 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-kjjsp"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.247305 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-kjjsp"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.287229 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.305749 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.336551 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.367711 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.489775 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-2" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="galera" containerID="cri-o://97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e" gracePeriod=30 Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.667025 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c3258d-a634-499a-98b2-3bf41a18a4b1" path="/var/lib/kubelet/pods/17c3258d-a634-499a-98b2-3bf41a18a4b1/volumes" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.667907 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e775589-8da9-4347-925a-3ef4b5fb7e28" path="/var/lib/kubelet/pods/3e775589-8da9-4347-925a-3ef4b5fb7e28/volumes" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.668475 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f489d842-f259-4a58-af65-16f8d04dfa07" path="/var/lib/kubelet/pods/f489d842-f259-4a58-af65-16f8d04dfa07/volumes" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.794315 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" event={"ID":"a7ac4fd8-0666-458f-8910-26abc36f0bdd","Type":"ContainerStarted","Data":"42d36128e08c118929604145ccb16de9ac635efe97f4c9d81c2a7c93fbba0995"} Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.794610 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" event={"ID":"a7ac4fd8-0666-458f-8910-26abc36f0bdd","Type":"ContainerStarted","Data":"123a786d3992065b81eb2ae8f45271411a0abbb965f7a496c0cc1281f86f8689"} Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.794741 5072 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" secret="" err="secret \"galera-openstack-dockercfg-98rq2\" not found" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.811235 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" podStartSLOduration=1.811216395 podStartE2EDuration="1.811216395s" podCreationTimestamp="2026-02-28 04:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:31:46.806090284 +0000 UTC m=+1328.800820466" watchObservedRunningTime="2026-02-28 04:31:46.811216395 +0000 UTC m=+1328.805946587" Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.873351 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:31:46 crc kubenswrapper[5072]: I0228 04:31:46.873562 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/memcached-0" podUID="2da690cd-386a-45cf-89a9-4d5a02218af4" containerName="memcached" containerID="cri-o://bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5" gracePeriod=30 Feb 28 04:31:46 crc kubenswrapper[5072]: E0228 04:31:46.933009 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Feb 28 04:31:46 crc kubenswrapper[5072]: E0228 04:31:46.933112 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts podName:a7ac4fd8-0666-458f-8910-26abc36f0bdd nodeName:}" failed. No retries permitted until 2026-02-28 04:31:47.433090847 +0000 UTC m=+1329.427821119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts") pod "keystonef9dc-account-delete-9skqx" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd") : configmap "openstack-scripts" not found Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.253026 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338502 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338605 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338635 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338716 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68nsk\" (UniqueName: \"kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338740 5072 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.338756 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9c25f535-2cfb-40b6-9412-9888a0fc1975\" (UID: \"9c25f535-2cfb-40b6-9412-9888a0fc1975\") " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.339534 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.339547 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.339733 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.339896 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.340189 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.340210 5072 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.340219 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.340228 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c25f535-2cfb-40b6-9412-9888a0fc1975-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.343875 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk" (OuterVolumeSpecName: "kube-api-access-68nsk") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). 
InnerVolumeSpecName "kube-api-access-68nsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.349049 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "9c25f535-2cfb-40b6-9412-9888a0fc1975" (UID: "9c25f535-2cfb-40b6-9412-9888a0fc1975"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.394671 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.441944 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68nsk\" (UniqueName: \"kubernetes.io/projected/9c25f535-2cfb-40b6-9412-9888a0fc1975-kube-api-access-68nsk\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: E0228 04:31:47.441983 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Feb 28 04:31:47 crc kubenswrapper[5072]: E0228 04:31:47.442056 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts podName:a7ac4fd8-0666-458f-8910-26abc36f0bdd nodeName:}" failed. No retries permitted until 2026-02-28 04:31:48.442038502 +0000 UTC m=+1330.436768694 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts") pod "keystonef9dc-account-delete-9skqx" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd") : configmap "openstack-scripts" not found Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.442000 5072 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.456295 5072 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.543409 5072 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.830304 5072 generic.go:334] "Generic (PLEG): container finished" podID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerID="42d36128e08c118929604145ccb16de9ac635efe97f4c9d81c2a7c93fbba0995" exitCode=1 Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.830388 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" event={"ID":"a7ac4fd8-0666-458f-8910-26abc36f0bdd","Type":"ContainerDied","Data":"42d36128e08c118929604145ccb16de9ac635efe97f4c9d81c2a7c93fbba0995"} Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.830916 5072 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" secret="" err="secret \"galera-openstack-dockercfg-98rq2\" not found" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.830958 5072 scope.go:117] "RemoveContainer" containerID="42d36128e08c118929604145ccb16de9ac635efe97f4c9d81c2a7c93fbba0995" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.832791 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.833096 5072 generic.go:334] "Generic (PLEG): container finished" podID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerID="97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e" exitCode=0 Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.833356 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.834517 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerDied","Data":"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e"} Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.834552 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"9c25f535-2cfb-40b6-9412-9888a0fc1975","Type":"ContainerDied","Data":"ce607371b58b0d39e69ca7abef7ff061fd840b7592e4e67caf4f4b8c094d4730"} Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.834574 5072 scope.go:117] "RemoveContainer" containerID="97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.870071 5072 scope.go:117] "RemoveContainer" containerID="e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.873362 5072 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="horizon-kuttl-tests/rabbitmq-server-0" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="rabbitmq" containerID="cri-o://e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185" gracePeriod=604800 Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.876450 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.886091 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.898797 5072 scope.go:117] "RemoveContainer" containerID="97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e" Feb 28 04:31:47 crc kubenswrapper[5072]: E0228 04:31:47.899302 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e\": container with ID starting with 97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e not found: ID does not exist" containerID="97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.899347 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e"} err="failed to get container status \"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e\": rpc error: code = NotFound desc = could not find container \"97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e\": container with ID starting with 97553569d9278a79ed1a5d79f9d3aa79d091aabb454a56adbe2614f1c6dc6f9e not found: ID does not exist" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.899420 5072 scope.go:117] "RemoveContainer" 
containerID="e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f" Feb 28 04:31:47 crc kubenswrapper[5072]: E0228 04:31:47.901885 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f\": container with ID starting with e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f not found: ID does not exist" containerID="e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f" Feb 28 04:31:47 crc kubenswrapper[5072]: I0228 04:31:47.901926 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f"} err="failed to get container status \"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f\": rpc error: code = NotFound desc = could not find container \"e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f\": container with ID starting with e3c37a111fa54a6762e88d7d2f359136c4f47029dbd2260a0a6dfcf5fcdc8e2f not found: ID does not exist" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.442096 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:31:48 crc kubenswrapper[5072]: E0228 04:31:48.459056 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Feb 28 04:31:48 crc kubenswrapper[5072]: E0228 04:31:48.459152 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts podName:a7ac4fd8-0666-458f-8910-26abc36f0bdd nodeName:}" failed. No retries permitted until 2026-02-28 04:31:50.459123025 +0000 UTC m=+1332.453853217 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts") pod "keystonef9dc-account-delete-9skqx" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd") : configmap "openstack-scripts" not found Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.559770 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skxzv\" (UniqueName: \"kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv\") pod \"2da690cd-386a-45cf-89a9-4d5a02218af4\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.560057 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data\") pod \"2da690cd-386a-45cf-89a9-4d5a02218af4\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.560105 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config\") pod \"2da690cd-386a-45cf-89a9-4d5a02218af4\" (UID: \"2da690cd-386a-45cf-89a9-4d5a02218af4\") " Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.561356 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data" (OuterVolumeSpecName: "config-data") pod "2da690cd-386a-45cf-89a9-4d5a02218af4" (UID: "2da690cd-386a-45cf-89a9-4d5a02218af4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.561394 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2da690cd-386a-45cf-89a9-4d5a02218af4" (UID: "2da690cd-386a-45cf-89a9-4d5a02218af4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.570271 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv" (OuterVolumeSpecName: "kube-api-access-skxzv") pod "2da690cd-386a-45cf-89a9-4d5a02218af4" (UID: "2da690cd-386a-45cf-89a9-4d5a02218af4"). InnerVolumeSpecName "kube-api-access-skxzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.613811 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-1" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="galera" containerID="cri-o://866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b" gracePeriod=28 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.622503 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"] Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.622785 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" containerName="manager" containerID="cri-o://ee773408be9a18b6a91dab47e9145ee0ac6cb6b2e5f91475156fb5be304a679f" gracePeriod=10 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.661425 5072 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.661470 5072 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2da690cd-386a-45cf-89a9-4d5a02218af4-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.661485 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skxzv\" (UniqueName: \"kubernetes.io/projected/2da690cd-386a-45cf-89a9-4d5a02218af4-kube-api-access-skxzv\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.673904 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.86:8081/readyz\": dial tcp 10.217.0.86:8081: connect: connection refused" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.674365 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" path="/var/lib/kubelet/pods/9c25f535-2cfb-40b6-9412-9888a0fc1975/volumes" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.847405 5072 generic.go:334] "Generic (PLEG): container finished" podID="fab191f5-56a2-4b06-88be-14286e763b52" containerID="d9899680691ab4bc6ab96c216a2101cf16f665914dcec5c175418bc070483704" exitCode=0 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.847480 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" event={"ID":"fab191f5-56a2-4b06-88be-14286e763b52","Type":"ContainerDied","Data":"d9899680691ab4bc6ab96c216a2101cf16f665914dcec5c175418bc070483704"} Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.858854 5072 
generic.go:334] "Generic (PLEG): container finished" podID="ba36008b-f798-4c99-bb4a-684f98897de8" containerID="ee773408be9a18b6a91dab47e9145ee0ac6cb6b2e5f91475156fb5be304a679f" exitCode=0 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.858915 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" event={"ID":"ba36008b-f798-4c99-bb4a-684f98897de8","Type":"ContainerDied","Data":"ee773408be9a18b6a91dab47e9145ee0ac6cb6b2e5f91475156fb5be304a679f"} Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.872053 5072 generic.go:334] "Generic (PLEG): container finished" podID="2da690cd-386a-45cf-89a9-4d5a02218af4" containerID="bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5" exitCode=0 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.872124 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"2da690cd-386a-45cf-89a9-4d5a02218af4","Type":"ContainerDied","Data":"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5"} Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.872154 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"2da690cd-386a-45cf-89a9-4d5a02218af4","Type":"ContainerDied","Data":"61edea0b66d9fcfa1633804127dd188965b4668825cae119e984bf6fb14b00a3"} Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.872173 5072 scope.go:117] "RemoveContainer" containerID="bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.872282 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.899599 5072 generic.go:334] "Generic (PLEG): container finished" podID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerID="5daf3fa507a64e689873e84ff4000129257acc16dbd21ecf24dfe01da2c51df4" exitCode=1 Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.899950 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" event={"ID":"a7ac4fd8-0666-458f-8910-26abc36f0bdd","Type":"ContainerDied","Data":"5daf3fa507a64e689873e84ff4000129257acc16dbd21ecf24dfe01da2c51df4"} Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.900453 5072 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" secret="" err="secret \"galera-openstack-dockercfg-98rq2\" not found" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.900485 5072 scope.go:117] "RemoveContainer" containerID="5daf3fa507a64e689873e84ff4000129257acc16dbd21ecf24dfe01da2c51df4" Feb 28 04:31:48 crc kubenswrapper[5072]: E0228 04:31:48.902862 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonef9dc-account-delete-9skqx_horizon-kuttl-tests(a7ac4fd8-0666-458f-8910-26abc36f0bdd)\"" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.904709 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.932777 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.961297 5072 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"] Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.961545 5072 scope.go:117] "RemoveContainer" containerID="bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.961570 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-4gztb" podUID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" containerName="registry-server" containerID="cri-o://d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8" gracePeriod=30 Feb 28 04:31:48 crc kubenswrapper[5072]: E0228 04:31:48.964844 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5\": container with ID starting with bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5 not found: ID does not exist" containerID="bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.964919 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5"} err="failed to get container status \"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5\": rpc error: code = NotFound desc = could not find container \"bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5\": container with ID starting with bf5ad711c4005244241e7c346d2c1a7ef5e3506d23aa2bcbcf1e62402a2e81f5 not found: ID does not exist" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.964956 5072 scope.go:117] "RemoveContainer" containerID="42d36128e08c118929604145ccb16de9ac635efe97f4c9d81c2a7c93fbba0995" Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.985727 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"] Feb 28 04:31:48 crc kubenswrapper[5072]: I0228 04:31:48.989751 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/627885af177bada86ea6a5b663ddc83090b7ca010623894380c2e69a9dfzkrl"] Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.040431 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/rabbitmq-server-0" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.66:5672: connect: connection refused" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.142845 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.218035 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275053 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert\") pod \"ba36008b-f798-4c99-bb4a-684f98897de8\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275118 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys\") pod \"fab191f5-56a2-4b06-88be-14286e763b52\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275141 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data\") 
pod \"fab191f5-56a2-4b06-88be-14286e763b52\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275163 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts\") pod \"fab191f5-56a2-4b06-88be-14286e763b52\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275235 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys\") pod \"fab191f5-56a2-4b06-88be-14286e763b52\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275279 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhh2\" (UniqueName: \"kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2\") pod \"ba36008b-f798-4c99-bb4a-684f98897de8\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275300 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7c9m\" (UniqueName: \"kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m\") pod \"fab191f5-56a2-4b06-88be-14286e763b52\" (UID: \"fab191f5-56a2-4b06-88be-14286e763b52\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.275326 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert\") pod \"ba36008b-f798-4c99-bb4a-684f98897de8\" (UID: \"ba36008b-f798-4c99-bb4a-684f98897de8\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.282125 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fab191f5-56a2-4b06-88be-14286e763b52" (UID: "fab191f5-56a2-4b06-88be-14286e763b52"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.282505 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "ba36008b-f798-4c99-bb4a-684f98897de8" (UID: "ba36008b-f798-4c99-bb4a-684f98897de8"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.283035 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fab191f5-56a2-4b06-88be-14286e763b52" (UID: "fab191f5-56a2-4b06-88be-14286e763b52"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.283374 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2" (OuterVolumeSpecName: "kube-api-access-sdhh2") pod "ba36008b-f798-4c99-bb4a-684f98897de8" (UID: "ba36008b-f798-4c99-bb4a-684f98897de8"). InnerVolumeSpecName "kube-api-access-sdhh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.283995 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m" (OuterVolumeSpecName: "kube-api-access-v7c9m") pod "fab191f5-56a2-4b06-88be-14286e763b52" (UID: "fab191f5-56a2-4b06-88be-14286e763b52"). 
InnerVolumeSpecName "kube-api-access-v7c9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.284414 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "ba36008b-f798-4c99-bb4a-684f98897de8" (UID: "ba36008b-f798-4c99-bb4a-684f98897de8"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.290580 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts" (OuterVolumeSpecName: "scripts") pod "fab191f5-56a2-4b06-88be-14286e763b52" (UID: "fab191f5-56a2-4b06-88be-14286e763b52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.310823 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data" (OuterVolumeSpecName: "config-data") pod "fab191f5-56a2-4b06-88be-14286e763b52" (UID: "fab191f5-56a2-4b06-88be-14286e763b52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.335856 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-4gztb" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376362 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fp6\" (UniqueName: \"kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6\") pod \"b9ce90ac-7ea8-44b6-bfae-05f51789c804\" (UID: \"b9ce90ac-7ea8-44b6-bfae-05f51789c804\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376613 5072 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376627 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhh2\" (UniqueName: \"kubernetes.io/projected/ba36008b-f798-4c99-bb4a-684f98897de8-kube-api-access-sdhh2\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376640 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7c9m\" (UniqueName: \"kubernetes.io/projected/fab191f5-56a2-4b06-88be-14286e763b52-kube-api-access-v7c9m\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376662 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376673 5072 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba36008b-f798-4c99-bb4a-684f98897de8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376682 5072 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376690 5072 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.376698 5072 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fab191f5-56a2-4b06-88be-14286e763b52-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.380288 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6" (OuterVolumeSpecName: "kube-api-access-q4fp6") pod "b9ce90ac-7ea8-44b6-bfae-05f51789c804" (UID: "b9ce90ac-7ea8-44b6-bfae-05f51789c804"). InnerVolumeSpecName "kube-api-access-q4fp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.408468 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478015 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bdfb\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478467 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478506 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478582 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478607 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478628 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478666 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.478690 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins\") pod \"c71c158a-9876-4f8e-9100-7c0a36834415\" (UID: \"c71c158a-9876-4f8e-9100-7c0a36834415\") " Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.479318 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4fp6\" (UniqueName: \"kubernetes.io/projected/b9ce90ac-7ea8-44b6-bfae-05f51789c804-kube-api-access-q4fp6\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.479683 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.481405 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb" (OuterVolumeSpecName: "kube-api-access-8bdfb") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "kube-api-access-8bdfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.481899 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.482144 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.482250 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.483125 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info" (OuterVolumeSpecName: "pod-info") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.488851 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680" (OuterVolumeSpecName: "persistence") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "pvc-72915375-6889-4787-a67e-5a149afe4680". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.536008 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c71c158a-9876-4f8e-9100-7c0a36834415" (UID: "c71c158a-9876-4f8e-9100-7c0a36834415"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580423 5072 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c71c158a-9876-4f8e-9100-7c0a36834415-pod-info\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580465 5072 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c71c158a-9876-4f8e-9100-7c0a36834415-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580481 5072 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580492 5072 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c71c158a-9876-4f8e-9100-7c0a36834415-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580505 5072 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580517 5072 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c71c158a-9876-4f8e-9100-7c0a36834415-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580530 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bdfb\" (UniqueName: \"kubernetes.io/projected/c71c158a-9876-4f8e-9100-7c0a36834415-kube-api-access-8bdfb\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.580573 5072 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") on node \"crc\" "
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.596570 5072 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.596795 5072 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-72915375-6889-4787-a67e-5a149afe4680" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680") on node "crc"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.681529 5072 reconciler_common.go:293] "Volume detached for volume \"pvc-72915375-6889-4787-a67e-5a149afe4680\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72915375-6889-4787-a67e-5a149afe4680\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.910885 5072 generic.go:334] "Generic (PLEG): container finished" podID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" containerID="d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8" exitCode=0
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.910922 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-4gztb"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.910963 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-4gztb" event={"ID":"b9ce90ac-7ea8-44b6-bfae-05f51789c804","Type":"ContainerDied","Data":"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.911029 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-4gztb" event={"ID":"b9ce90ac-7ea8-44b6-bfae-05f51789c804","Type":"ContainerDied","Data":"ec5d3a3d26432bfb5e010bbf5e39a59654ec9329225572915915c16db19fe0ae"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.911053 5072 scope.go:117] "RemoveContainer" containerID="d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.914765 5072 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" secret="" err="secret \"galera-openstack-dockercfg-98rq2\" not found"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.914828 5072 scope.go:117] "RemoveContainer" containerID="5daf3fa507a64e689873e84ff4000129257acc16dbd21ecf24dfe01da2c51df4"
Feb 28 04:31:49 crc kubenswrapper[5072]: E0228 04:31:49.915149 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystonef9dc-account-delete-9skqx_horizon-kuttl-tests(a7ac4fd8-0666-458f-8910-26abc36f0bdd)\"" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.915752 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.915807 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-577bcf6dcc-r9srn" event={"ID":"fab191f5-56a2-4b06-88be-14286e763b52","Type":"ContainerDied","Data":"6e2686e56a3e7d735e0b953e61784ab5586e46342f298e2d62b0bfb93f7da99b"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.919598 5072 generic.go:334] "Generic (PLEG): container finished" podID="c71c158a-9876-4f8e-9100-7c0a36834415" containerID="e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185" exitCode=0
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.919668 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.919691 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerDied","Data":"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.919717 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"c71c158a-9876-4f8e-9100-7c0a36834415","Type":"ContainerDied","Data":"269042d0c382cdfb8d4cf6f67f41cc281a02225c549660ad5d64100ede0972ef"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.923602 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58" event={"ID":"ba36008b-f798-4c99-bb4a-684f98897de8","Type":"ContainerDied","Data":"0be814b6a25b4953055eafc33d110d0f5809d2b0666a2a80939498a563f18f41"}
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.923667 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.931763 5072 scope.go:117] "RemoveContainer" containerID="d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"
Feb 28 04:31:49 crc kubenswrapper[5072]: E0228 04:31:49.932384 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8\": container with ID starting with d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8 not found: ID does not exist" containerID="d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.932444 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8"} err="failed to get container status \"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8\": rpc error: code = NotFound desc = could not find container \"d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8\": container with ID starting with d55011f7778e805e39e6353c6c5487e6eca17d1d876a7131971370a5925368e8 not found: ID does not exist"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.932479 5072 scope.go:117] "RemoveContainer" containerID="d9899680691ab4bc6ab96c216a2101cf16f665914dcec5c175418bc070483704"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.968861 5072 scope.go:117] "RemoveContainer" containerID="e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.982872 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"]
Feb 28 04:31:49 crc kubenswrapper[5072]: I0228 04:31:49.994870 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-577bcf6dcc-r9srn"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.002101 5072 scope.go:117] "RemoveContainer" containerID="25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.003970 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.009548 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-4gztb"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.015126 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.025735 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.026122 5072 scope.go:117] "RemoveContainer" containerID="e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.026471 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185\": container with ID starting with e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185 not found: ID does not exist" containerID="e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.026509 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185"} err="failed to get container status \"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185\": rpc error: code = NotFound desc = could not find container \"e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185\": container with ID starting with e814f8979cd987bed1ba685abb070637ffd1c54d3b8c5ac3cb6a6a5a450e7185 not found: ID does not exist"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.026534 5072 scope.go:117] "RemoveContainer" containerID="25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.026875 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e\": container with ID starting with 25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e not found: ID does not exist" containerID="25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.026901 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e"} err="failed to get container status \"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e\": rpc error: code = NotFound desc = could not find container \"25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e\": container with ID starting with 25cd65eea13a4dec7c45316f939fe6c7a1ad58c61a487560531199a80555628e not found: ID does not exist"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.026916 5072 scope.go:117] "RemoveContainer" containerID="ee773408be9a18b6a91dab47e9145ee0ac6cb6b2e5f91475156fb5be304a679f"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.030662 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.042600 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c7c8d5cfd-hjt58"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.473402 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-lkm6s"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.479723 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-lkm6s"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.488466 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.492054 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"]
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.495501 5072 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.495578 5072 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts podName:a7ac4fd8-0666-458f-8910-26abc36f0bdd nodeName:}" failed. No retries permitted until 2026-02-28 04:31:54.495560173 +0000 UTC m=+1336.490290365 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts") pod "keystonef9dc-account-delete-9skqx" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd") : configmap "openstack-scripts" not found
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.495806 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-f9dc-account-create-update-b59fq"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.548717 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.582743 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-0" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="galera" containerID="cri-o://22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" gracePeriod=26
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596077 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phj5n\" (UniqueName: \"kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596154 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596193 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596263 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596887 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.596946 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.597144 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.597229 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.597311 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8171cc83-a178-4d19-b1c5-0d93b123838c\" (UID: \"8171cc83-a178-4d19-b1c5-0d93b123838c\") "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.597629 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.598171 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.598196 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8171cc83-a178-4d19-b1c5-0d93b123838c-config-data-generated\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.598210 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.598222 5072 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8171cc83-a178-4d19-b1c5-0d93b123838c-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.612447 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n" (OuterVolumeSpecName: "kube-api-access-phj5n") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "kube-api-access-phj5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.627167 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "8171cc83-a178-4d19-b1c5-0d93b123838c" (UID: "8171cc83-a178-4d19-b1c5-0d93b123838c"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.632808 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.633015 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" podUID="6325e48f-129d-4832-99d8-1cd8088708c3" containerName="manager" containerID="cri-o://9257be1e822cb1d78e5876a482a752924d94b6f97d2c75b12e040888675fc056" gracePeriod=10
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.671392 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9b43ee-d217-4d73-8029-176c01146473" path="/var/lib/kubelet/pods/2b9b43ee-d217-4d73-8029-176c01146473/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.672175 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da690cd-386a-45cf-89a9-4d5a02218af4" path="/var/lib/kubelet/pods/2da690cd-386a-45cf-89a9-4d5a02218af4/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.672625 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e18fcea-e598-4728-9708-b423c2f5686b" path="/var/lib/kubelet/pods/8e18fcea-e598-4728-9708-b423c2f5686b/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.673470 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" path="/var/lib/kubelet/pods/b9ce90ac-7ea8-44b6-bfae-05f51789c804/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.673913 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" path="/var/lib/kubelet/pods/ba36008b-f798-4c99-bb4a-684f98897de8/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.674479 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" path="/var/lib/kubelet/pods/c71c158a-9876-4f8e-9100-7c0a36834415/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.675397 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9b3873-ce74-47e3-a875-d07950e69125" path="/var/lib/kubelet/pods/df9b3873-ce74-47e3-a875-d07950e69125/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.675845 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab191f5-56a2-4b06-88be-14286e763b52" path="/var/lib/kubelet/pods/fab191f5-56a2-4b06-88be-14286e763b52/volumes"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.699834 5072 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.699883 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phj5n\" (UniqueName: \"kubernetes.io/projected/8171cc83-a178-4d19-b1c5-0d93b123838c-kube-api-access-phj5n\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.710890 5072 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.800946 5072 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.858495 5072 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.860228 5072 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.869053 5072 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 28 04:31:50 crc kubenswrapper[5072]: E0228 04:31:50.869137 5072 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="horizon-kuttl-tests/openstack-galera-0" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="galera"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.916735 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.917008 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-crv8s" podUID="f7ec1561-2733-469f-b4b4-13035f2557f0" containerName="registry-server" containerID="cri-o://a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81" gracePeriod=30
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.939001 5072 generic.go:334] "Generic (PLEG): container finished" podID="6325e48f-129d-4832-99d8-1cd8088708c3" containerID="9257be1e822cb1d78e5876a482a752924d94b6f97d2c75b12e040888675fc056" exitCode=0
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.939097 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" event={"ID":"6325e48f-129d-4832-99d8-1cd8088708c3","Type":"ContainerDied","Data":"9257be1e822cb1d78e5876a482a752924d94b6f97d2c75b12e040888675fc056"}
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.948167 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"]
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.949818 5072 generic.go:334] "Generic (PLEG): container finished" podID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerID="866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b" exitCode=0
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.950275 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.950333 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerDied","Data":"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"}
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.950377 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"8171cc83-a178-4d19-b1c5-0d93b123838c","Type":"ContainerDied","Data":"27007c39a2131d9b8e6c4702a297dd6d1e6d6c5632bd74698ac8f31dbc6f51e1"}
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.950393 5072 scope.go:117] "RemoveContainer" containerID="866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"
Feb 28 04:31:50 crc kubenswrapper[5072]: I0228 04:31:50.962408 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a58nqb5d"]
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.123729 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"]
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.127425 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"]
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.141534 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.151214 5072 scope.go:117] "RemoveContainer" containerID="13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.184227 5072 scope.go:117] "RemoveContainer" containerID="866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"
Feb 28 04:31:51 crc kubenswrapper[5072]: E0228 04:31:51.188093 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b\": container with ID starting with 866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b not found: ID does not exist" containerID="866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.188129 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b"} err="failed to get container status \"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b\": rpc error: code = NotFound desc = could not find container \"866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b\": container with ID starting with 866c9368dc2e231135ded0ed584c6786cba66d438faaeff2f1a3fc3e0eb68a2b not found: ID does not exist"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.188150 5072 scope.go:117] "RemoveContainer" containerID="13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"
Feb 28 04:31:51 crc kubenswrapper[5072]: E0228 04:31:51.188406 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb\": container with ID starting with 13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb not found: ID does not exist" containerID="13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.188423 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb"} err="failed to get container status \"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb\": rpc error: code = NotFound desc = could not find container \"13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb\": container with ID starting with 13db49bfc01e9ad1500509adf9d8d82fce556e0f4800f95044b7df4fcc6f98fb not found: ID does not exist"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.206235 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert\") pod \"6325e48f-129d-4832-99d8-1cd8088708c3\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.206296 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5ml6\" (UniqueName: \"kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6\") pod \"6325e48f-129d-4832-99d8-1cd8088708c3\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.206448 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert\") pod \"6325e48f-129d-4832-99d8-1cd8088708c3\" (UID: \"6325e48f-129d-4832-99d8-1cd8088708c3\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.210932 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6" (OuterVolumeSpecName: "kube-api-access-c5ml6") pod "6325e48f-129d-4832-99d8-1cd8088708c3" (UID: "6325e48f-129d-4832-99d8-1cd8088708c3"). InnerVolumeSpecName "kube-api-access-c5ml6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.214033 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "6325e48f-129d-4832-99d8-1cd8088708c3" (UID: "6325e48f-129d-4832-99d8-1cd8088708c3"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.217704 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "6325e48f-129d-4832-99d8-1cd8088708c3" (UID: "6325e48f-129d-4832-99d8-1cd8088708c3"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.269650 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.308177 5072 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.308208 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6325e48f-129d-4832-99d8-1cd8088708c3-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.308219 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5ml6\" (UniqueName: \"kubernetes.io/projected/6325e48f-129d-4832-99d8-1cd8088708c3-kube-api-access-c5ml6\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.332229 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-crv8s"
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.408948 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts\") pod \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.409068 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpckf\" (UniqueName: \"kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf\") pod \"f7ec1561-2733-469f-b4b4-13035f2557f0\" (UID: \"f7ec1561-2733-469f-b4b4-13035f2557f0\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.409086 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk79r\" (UniqueName: \"kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r\") pod \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\" (UID: \"a7ac4fd8-0666-458f-8910-26abc36f0bdd\") "
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.409404 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7ac4fd8-0666-458f-8910-26abc36f0bdd" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.412497 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r" (OuterVolumeSpecName: "kube-api-access-jk79r") pod "a7ac4fd8-0666-458f-8910-26abc36f0bdd" (UID: "a7ac4fd8-0666-458f-8910-26abc36f0bdd").
InnerVolumeSpecName "kube-api-access-jk79r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.412717 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf" (OuterVolumeSpecName: "kube-api-access-mpckf") pod "f7ec1561-2733-469f-b4b4-13035f2557f0" (UID: "f7ec1561-2733-469f-b4b4-13035f2557f0"). InnerVolumeSpecName "kube-api-access-mpckf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.484083 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.511042 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpckf\" (UniqueName: \"kubernetes.io/projected/f7ec1561-2733-469f-b4b4-13035f2557f0-kube-api-access-mpckf\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.511085 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk79r\" (UniqueName: \"kubernetes.io/projected/a7ac4fd8-0666-458f-8910-26abc36f0bdd-kube-api-access-jk79r\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.511097 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ac4fd8-0666-458f-8910-26abc36f0bdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.611582 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 
04:31:51.611697 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.611722 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.611754 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.611777 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szqmd\" (UniqueName: \"kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.611835 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated\") pod \"e56491ab-6d17-4127-a25b-75b5e900e0aa\" (UID: \"e56491ab-6d17-4127-a25b-75b5e900e0aa\") " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.612272 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config" (OuterVolumeSpecName: "kolla-config") pod 
"e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.612325 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.612344 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.612511 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.615132 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd" (OuterVolumeSpecName: "kube-api-access-szqmd") pod "e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "kube-api-access-szqmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.619592 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "e56491ab-6d17-4127-a25b-75b5e900e0aa" (UID: "e56491ab-6d17-4127-a25b-75b5e900e0aa"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713337 5072 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713575 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713675 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szqmd\" (UniqueName: \"kubernetes.io/projected/e56491ab-6d17-4127-a25b-75b5e900e0aa-kube-api-access-szqmd\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713754 5072 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e56491ab-6d17-4127-a25b-75b5e900e0aa-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713808 5072 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.713860 5072 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e56491ab-6d17-4127-a25b-75b5e900e0aa-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.731480 5072 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.815845 5072 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.961336 5072 generic.go:334] "Generic (PLEG): container finished" podID="f7ec1561-2733-469f-b4b4-13035f2557f0" containerID="a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81" exitCode=0 Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.961419 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-crv8s" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.961423 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-crv8s" event={"ID":"f7ec1561-2733-469f-b4b4-13035f2557f0","Type":"ContainerDied","Data":"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.961498 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-crv8s" event={"ID":"f7ec1561-2733-469f-b4b4-13035f2557f0","Type":"ContainerDied","Data":"1e5481e986c0cf83a9d0bf6b39b68cf807559761b304457fcd5575cbdf66157b"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.961524 5072 scope.go:117] "RemoveContainer" containerID="a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.966368 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.967169 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q" event={"ID":"6325e48f-129d-4832-99d8-1cd8088708c3","Type":"ContainerDied","Data":"297f2d54391d1c87255c4371f6bee8d33f5f083f0dc1db2a939647c519af08c0"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.970237 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" event={"ID":"a7ac4fd8-0666-458f-8910-26abc36f0bdd","Type":"ContainerDied","Data":"123a786d3992065b81eb2ae8f45271411a0abbb965f7a496c0cc1281f86f8689"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.970297 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystonef9dc-account-delete-9skqx" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.978092 5072 generic.go:334] "Generic (PLEG): container finished" podID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" exitCode=0 Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.978271 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.978297 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerDied","Data":"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.980519 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"e56491ab-6d17-4127-a25b-75b5e900e0aa","Type":"ContainerDied","Data":"940dd5a230baf5ea65ef57598a5bfd55ce14a6b0e23580326c809965c27f2c2d"} Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.991250 5072 scope.go:117] "RemoveContainer" containerID="a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81" Feb 28 04:31:51 crc kubenswrapper[5072]: E0228 04:31:51.991798 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81\": container with ID starting with a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81 not found: ID does not exist" containerID="a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.991902 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81"} err="failed to get container status \"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81\": rpc error: code = NotFound desc = could not find container \"a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81\": container with ID starting with a24df8c3e94264b53b151a7d0b3dc470fd182808407e5bd0c999d4733247cd81 not found: ID does not exist" Feb 28 04:31:51 crc kubenswrapper[5072]: I0228 04:31:51.991948 5072 
scope.go:117] "RemoveContainer" containerID="9257be1e822cb1d78e5876a482a752924d94b6f97d2c75b12e040888675fc056" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.009814 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.016972 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-784c7fcf4f-fvf4q"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.017039 5072 scope.go:117] "RemoveContainer" containerID="5daf3fa507a64e689873e84ff4000129257acc16dbd21ecf24dfe01da2c51df4" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.043500 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.045359 5072 scope.go:117] "RemoveContainer" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.050836 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-crv8s"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.059205 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.063632 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.068150 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.069325 5072 scope.go:117] "RemoveContainer" containerID="6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.073576 5072 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["horizon-kuttl-tests/keystonef9dc-account-delete-9skqx"] Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.108135 5072 scope.go:117] "RemoveContainer" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" Feb 28 04:31:52 crc kubenswrapper[5072]: E0228 04:31:52.108717 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7\": container with ID starting with 22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7 not found: ID does not exist" containerID="22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.108762 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7"} err="failed to get container status \"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7\": rpc error: code = NotFound desc = could not find container \"22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7\": container with ID starting with 22923bc7fb6b00b2c45a6c2fe17bc3e8e1d8c9956b429372dfce46aca53cb8f7 not found: ID does not exist" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.108792 5072 scope.go:117] "RemoveContainer" containerID="6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a" Feb 28 04:31:52 crc kubenswrapper[5072]: E0228 04:31:52.109109 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a\": container with ID starting with 6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a not found: ID does not exist" containerID="6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a" Feb 28 04:31:52 crc 
kubenswrapper[5072]: I0228 04:31:52.109134 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a"} err="failed to get container status \"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a\": rpc error: code = NotFound desc = could not find container \"6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a\": container with ID starting with 6e8560317d9c9bf7f2fe0437bb4714865b3a34fb141398d91668d549d485296a not found: ID does not exist" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.672093 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bb064f-a1bd-40b6-baaf-0ed5f71c926d" path="/var/lib/kubelet/pods/46bb064f-a1bd-40b6-baaf-0ed5f71c926d/volumes" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.673935 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6325e48f-129d-4832-99d8-1cd8088708c3" path="/var/lib/kubelet/pods/6325e48f-129d-4832-99d8-1cd8088708c3/volumes" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.675159 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" path="/var/lib/kubelet/pods/8171cc83-a178-4d19-b1c5-0d93b123838c/volumes" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.677508 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" path="/var/lib/kubelet/pods/a7ac4fd8-0666-458f-8910-26abc36f0bdd/volumes" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.678931 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" path="/var/lib/kubelet/pods/e56491ab-6d17-4127-a25b-75b5e900e0aa/volumes" Feb 28 04:31:52 crc kubenswrapper[5072]: I0228 04:31:52.680126 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ec1561-2733-469f-b4b4-13035f2557f0" 
path="/var/lib/kubelet/pods/f7ec1561-2733-469f-b4b4-13035f2557f0/volumes" Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.505426 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"] Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.506115 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" podUID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" containerName="operator" containerID="cri-o://00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef" gracePeriod=10 Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.832001 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"] Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.832692 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" podUID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" containerName="registry-server" containerID="cri-o://b90943e538e0b47e4b23feddf1a30bf60bffbce8de498186a40175118f72a618" gracePeriod=30 Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.859715 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"] Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.867107 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590pxqxj"] Feb 28 04:31:53 crc kubenswrapper[5072]: I0228 04:31:53.981756 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.053205 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzml\" (UniqueName: \"kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml\") pod \"373e4c12-ee6c-4f89-b684-fb8e61d18c9f\" (UID: \"373e4c12-ee6c-4f89-b684-fb8e61d18c9f\") " Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.059812 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml" (OuterVolumeSpecName: "kube-api-access-vhzml") pod "373e4c12-ee6c-4f89-b684-fb8e61d18c9f" (UID: "373e4c12-ee6c-4f89-b684-fb8e61d18c9f"). InnerVolumeSpecName "kube-api-access-vhzml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.078579 5072 generic.go:334] "Generic (PLEG): container finished" podID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" containerID="b90943e538e0b47e4b23feddf1a30bf60bffbce8de498186a40175118f72a618" exitCode=0 Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.078658 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" event={"ID":"c35bd0a7-4cba-4185-a45c-bfaf82c04638","Type":"ContainerDied","Data":"b90943e538e0b47e4b23feddf1a30bf60bffbce8de498186a40175118f72a618"} Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.083900 5072 generic.go:334] "Generic (PLEG): container finished" podID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" containerID="00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef" exitCode=0 Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.083945 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" 
event={"ID":"373e4c12-ee6c-4f89-b684-fb8e61d18c9f","Type":"ContainerDied","Data":"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef"} Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.083976 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" event={"ID":"373e4c12-ee6c-4f89-b684-fb8e61d18c9f","Type":"ContainerDied","Data":"9cc7788cfbbf39ed7e3f8db499050cb547b13be135eaeaa59e64e0b95df25532"} Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.083996 5072 scope.go:117] "RemoveContainer" containerID="00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.084125 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.107367 5072 scope.go:117] "RemoveContainer" containerID="00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef" Feb 28 04:31:54 crc kubenswrapper[5072]: E0228 04:31:54.110127 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef\": container with ID starting with 00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef not found: ID does not exist" containerID="00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.110181 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef"} err="failed to get container status \"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef\": rpc error: code = NotFound desc = could not find container \"00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef\": container 
with ID starting with 00754db9a535f2838f78ab7b42d9c0c94fb32d10f69d5fcf7088dac9088405ef not found: ID does not exist" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.120878 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"] Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.120936 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-bt7bg"] Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.154501 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzml\" (UniqueName: \"kubernetes.io/projected/373e4c12-ee6c-4f89-b684-fb8e61d18c9f-kube-api-access-vhzml\") on node \"crc\" DevicePath \"\"" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.214623 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.262322 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7bcf\" (UniqueName: \"kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf\") pod \"c35bd0a7-4cba-4185-a45c-bfaf82c04638\" (UID: \"c35bd0a7-4cba-4185-a45c-bfaf82c04638\") " Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.266223 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf" (OuterVolumeSpecName: "kube-api-access-m7bcf") pod "c35bd0a7-4cba-4185-a45c-bfaf82c04638" (UID: "c35bd0a7-4cba-4185-a45c-bfaf82c04638"). InnerVolumeSpecName "kube-api-access-m7bcf". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.364525 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7bcf\" (UniqueName: \"kubernetes.io/projected/c35bd0a7-4cba-4185-a45c-bfaf82c04638-kube-api-access-m7bcf\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.667428 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" path="/var/lib/kubelet/pods/373e4c12-ee6c-4f89-b684-fb8e61d18c9f/volumes"
Feb 28 04:31:54 crc kubenswrapper[5072]: I0228 04:31:54.668857 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f246c68d-2d24-48f5-9e70-7286730298f3" path="/var/lib/kubelet/pods/f246c68d-2d24-48f5-9e70-7286730298f3/volumes"
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.092381 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv"
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.092335 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-9g9sv" event={"ID":"c35bd0a7-4cba-4185-a45c-bfaf82c04638","Type":"ContainerDied","Data":"bcdcba18b6737963fceb481c95cf1c03046dc7d97b3e63fd3f7505c33f6a9d52"}
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.092580 5072 scope.go:117] "RemoveContainer" containerID="b90943e538e0b47e4b23feddf1a30bf60bffbce8de498186a40175118f72a618"
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.110979 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"]
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.116745 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-9g9sv"]
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.994439 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"]
Feb 28 04:31:55 crc kubenswrapper[5072]: I0228 04:31:55.995231 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" podUID="23fe0761-ad11-4ccf-9511-2c074bed0915" containerName="manager" containerID="cri-o://25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa" gracePeriod=10
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.308491 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"]
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.308736 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-ss2pk" podUID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" containerName="registry-server" containerID="cri-o://bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109" gracePeriod=30
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.332344 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl"]
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.335707 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779039xkbl"]
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.448505 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.490247 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert\") pod \"23fe0761-ad11-4ccf-9511-2c074bed0915\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") "
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.490366 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw6tm\" (UniqueName: \"kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm\") pod \"23fe0761-ad11-4ccf-9511-2c074bed0915\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") "
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.490574 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert\") pod \"23fe0761-ad11-4ccf-9511-2c074bed0915\" (UID: \"23fe0761-ad11-4ccf-9511-2c074bed0915\") "
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.495154 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "23fe0761-ad11-4ccf-9511-2c074bed0915" (UID: "23fe0761-ad11-4ccf-9511-2c074bed0915"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.495291 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "23fe0761-ad11-4ccf-9511-2c074bed0915" (UID: "23fe0761-ad11-4ccf-9511-2c074bed0915"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.495476 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm" (OuterVolumeSpecName: "kube-api-access-kw6tm") pod "23fe0761-ad11-4ccf-9511-2c074bed0915" (UID: "23fe0761-ad11-4ccf-9511-2c074bed0915"). InnerVolumeSpecName "kube-api-access-kw6tm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.591987 5072 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.592021 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw6tm\" (UniqueName: \"kubernetes.io/projected/23fe0761-ad11-4ccf-9511-2c074bed0915-kube-api-access-kw6tm\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.592032 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe0761-ad11-4ccf-9511-2c074bed0915-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.651069 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-ss2pk"
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.669956 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f93606-b8ad-4fbe-b8dc-b7c9722e450c" path="/var/lib/kubelet/pods/05f93606-b8ad-4fbe-b8dc-b7c9722e450c/volumes"
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.670539 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" path="/var/lib/kubelet/pods/c35bd0a7-4cba-4185-a45c-bfaf82c04638/volumes"
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.694980 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn4bj\" (UniqueName: \"kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj\") pod \"cb49b1ff-eed4-41e8-a6e2-5b1514499d41\" (UID: \"cb49b1ff-eed4-41e8-a6e2-5b1514499d41\") "
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.698791 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj" (OuterVolumeSpecName: "kube-api-access-pn4bj") pod "cb49b1ff-eed4-41e8-a6e2-5b1514499d41" (UID: "cb49b1ff-eed4-41e8-a6e2-5b1514499d41"). InnerVolumeSpecName "kube-api-access-pn4bj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:56 crc kubenswrapper[5072]: I0228 04:31:56.797398 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn4bj\" (UniqueName: \"kubernetes.io/projected/cb49b1ff-eed4-41e8-a6e2-5b1514499d41-kube-api-access-pn4bj\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.109710 5072 generic.go:334] "Generic (PLEG): container finished" podID="23fe0761-ad11-4ccf-9511-2c074bed0915" containerID="25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa" exitCode=0
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.109788 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" event={"ID":"23fe0761-ad11-4ccf-9511-2c074bed0915","Type":"ContainerDied","Data":"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"}
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.109819 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj" event={"ID":"23fe0761-ad11-4ccf-9511-2c074bed0915","Type":"ContainerDied","Data":"2b4eb3fb04285d1bf97e1c24b4543d0ed3c9548abf47cc22df098795bf07120d"}
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.109819 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.109839 5072 scope.go:117] "RemoveContainer" containerID="25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.112295 5072 generic.go:334] "Generic (PLEG): container finished" podID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" containerID="bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109" exitCode=0
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.112326 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-ss2pk" event={"ID":"cb49b1ff-eed4-41e8-a6e2-5b1514499d41","Type":"ContainerDied","Data":"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"}
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.112352 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-ss2pk" event={"ID":"cb49b1ff-eed4-41e8-a6e2-5b1514499d41","Type":"ContainerDied","Data":"33a482e26fd92d69b5a943ab156012bb1f0b2376f21b2e4da2149883d6e2f451"}
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.112366 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-ss2pk"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.163021 5072 scope.go:117] "RemoveContainer" containerID="25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"
Feb 28 04:31:57 crc kubenswrapper[5072]: E0228 04:31:57.163569 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa\": container with ID starting with 25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa not found: ID does not exist" containerID="25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.163601 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa"} err="failed to get container status \"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa\": rpc error: code = NotFound desc = could not find container \"25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa\": container with ID starting with 25c3d12861e02db8ba4c769ef2860dce9708121ac8e36e8c8df8a17ecfd1effa not found: ID does not exist"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.163626 5072 scope.go:117] "RemoveContainer" containerID="bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.179089 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"]
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.188223 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8c869c9f9-dnsxj"]
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.193023 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"]
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.199200 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-ss2pk"]
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.208828 5072 scope.go:117] "RemoveContainer" containerID="bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"
Feb 28 04:31:57 crc kubenswrapper[5072]: E0228 04:31:57.212780 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109\": container with ID starting with bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109 not found: ID does not exist" containerID="bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.212833 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109"} err="failed to get container status \"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109\": rpc error: code = NotFound desc = could not find container \"bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109\": container with ID starting with bc7db28d114c84e78b81aea7fef6c8ead4f3a486d05bfcbe325587bba3356109 not found: ID does not exist"
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.900261 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"]
Feb 28 04:31:57 crc kubenswrapper[5072]: I0228 04:31:57.900456 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" podUID="f3f522d3-98e9-446c-985e-01fbeb36f25d" containerName="manager" containerID="cri-o://9911e216c4f0525d9dd024a0158c92fa17cd88d2d765c7bfc3453693f6524216" gracePeriod=10
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.128195 5072 generic.go:334] "Generic (PLEG): container finished" podID="f3f522d3-98e9-446c-985e-01fbeb36f25d" containerID="9911e216c4f0525d9dd024a0158c92fa17cd88d2d765c7bfc3453693f6524216" exitCode=0
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.128263 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" event={"ID":"f3f522d3-98e9-446c-985e-01fbeb36f25d","Type":"ContainerDied","Data":"9911e216c4f0525d9dd024a0158c92fa17cd88d2d765c7bfc3453693f6524216"}
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.261822 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.262177 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-vqstk" podUID="aea9870e-2b10-4d66-b478-023e2aed2ced" containerName="registry-server" containerID="cri-o://131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000" gracePeriod=30
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.298735 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"]
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.299787 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dc8ffhx"]
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.363992 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.419686 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert\") pod \"f3f522d3-98e9-446c-985e-01fbeb36f25d\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") "
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.419765 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzh5\" (UniqueName: \"kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5\") pod \"f3f522d3-98e9-446c-985e-01fbeb36f25d\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") "
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.419870 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert\") pod \"f3f522d3-98e9-446c-985e-01fbeb36f25d\" (UID: \"f3f522d3-98e9-446c-985e-01fbeb36f25d\") "
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.424422 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "f3f522d3-98e9-446c-985e-01fbeb36f25d" (UID: "f3f522d3-98e9-446c-985e-01fbeb36f25d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.429329 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5" (OuterVolumeSpecName: "kube-api-access-pjzh5") pod "f3f522d3-98e9-446c-985e-01fbeb36f25d" (UID: "f3f522d3-98e9-446c-985e-01fbeb36f25d"). InnerVolumeSpecName "kube-api-access-pjzh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.429933 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "f3f522d3-98e9-446c-985e-01fbeb36f25d" (UID: "f3f522d3-98e9-446c-985e-01fbeb36f25d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.521845 5072 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.521888 5072 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f522d3-98e9-446c-985e-01fbeb36f25d-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.521904 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzh5\" (UniqueName: \"kubernetes.io/projected/f3f522d3-98e9-446c-985e-01fbeb36f25d-kube-api-access-pjzh5\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.637238 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.668132 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fe0761-ad11-4ccf-9511-2c074bed0915" path="/var/lib/kubelet/pods/23fe0761-ad11-4ccf-9511-2c074bed0915/volumes"
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.668973 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5777f06f-eab2-41eb-8b38-f1255369da51" path="/var/lib/kubelet/pods/5777f06f-eab2-41eb-8b38-f1255369da51/volumes"
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.669767 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" path="/var/lib/kubelet/pods/cb49b1ff-eed4-41e8-a6e2-5b1514499d41/volumes"
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.725219 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6tww\" (UniqueName: \"kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww\") pod \"aea9870e-2b10-4d66-b478-023e2aed2ced\" (UID: \"aea9870e-2b10-4d66-b478-023e2aed2ced\") "
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.728612 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww" (OuterVolumeSpecName: "kube-api-access-k6tww") pod "aea9870e-2b10-4d66-b478-023e2aed2ced" (UID: "aea9870e-2b10-4d66-b478-023e2aed2ced"). InnerVolumeSpecName "kube-api-access-k6tww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:31:58 crc kubenswrapper[5072]: I0228 04:31:58.826360 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6tww\" (UniqueName: \"kubernetes.io/projected/aea9870e-2b10-4d66-b478-023e2aed2ced-kube-api-access-k6tww\") on node \"crc\" DevicePath \"\""
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.134367 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj" event={"ID":"f3f522d3-98e9-446c-985e-01fbeb36f25d","Type":"ContainerDied","Data":"54d4311087c4b96ff26f0518495676996fc50296f80594220bed2ec2096fed08"}
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.134409 5072 scope.go:117] "RemoveContainer" containerID="9911e216c4f0525d9dd024a0158c92fa17cd88d2d765c7bfc3453693f6524216"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.134435 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.137670 5072 generic.go:334] "Generic (PLEG): container finished" podID="aea9870e-2b10-4d66-b478-023e2aed2ced" containerID="131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000" exitCode=0
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.137703 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vqstk"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.137902 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vqstk" event={"ID":"aea9870e-2b10-4d66-b478-023e2aed2ced","Type":"ContainerDied","Data":"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"}
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.137996 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vqstk" event={"ID":"aea9870e-2b10-4d66-b478-023e2aed2ced","Type":"ContainerDied","Data":"9e3ba8aaa01e28888ed953fd41037459e7a3155edd2c94e229562584b851d4f0"}
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.150340 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"]
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.153551 5072 scope.go:117] "RemoveContainer" containerID="131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.156708 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56f56f4fcc-2h9bj"]
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.167770 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.168840 5072 scope.go:117] "RemoveContainer" containerID="131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"
Feb 28 04:31:59 crc kubenswrapper[5072]: E0228 04:31:59.169222 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000\": container with ID starting with 131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000 not found: ID does not exist" containerID="131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.169252 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000"} err="failed to get container status \"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000\": rpc error: code = NotFound desc = could not find container \"131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000\": container with ID starting with 131e52691f446d27a9a6d41517e95e12876a80120d0a01bf76cce155af739000 not found: ID does not exist"
Feb 28 04:31:59 crc kubenswrapper[5072]: I0228 04:31:59.173026 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-vqstk"]
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.135043 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537552-ms869"]
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.135583 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.135596 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.135611 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea9870e-2b10-4d66-b478-023e2aed2ced" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.135619 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea9870e-2b10-4d66-b478-023e2aed2ced" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.135632 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f522d3-98e9-446c-985e-01fbeb36f25d" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136870 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f522d3-98e9-446c-985e-01fbeb36f25d" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136884 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136897 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136908 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" containerName="operator"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136914 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" containerName="operator"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136926 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6325e48f-129d-4832-99d8-1cd8088708c3" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136933 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="6325e48f-129d-4832-99d8-1cd8088708c3" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136944 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136952 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136964 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136971 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.136984 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.136991 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137000 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="rabbitmq"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137006 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="rabbitmq"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137015 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec1561-2733-469f-b4b4-13035f2557f0" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137024 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec1561-2733-469f-b4b4-13035f2557f0" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137035 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fe0761-ad11-4ccf-9511-2c074bed0915" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137042 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fe0761-ad11-4ccf-9511-2c074bed0915" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137054 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab191f5-56a2-4b06-88be-14286e763b52" containerName="keystone-api"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137062 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab191f5-56a2-4b06-88be-14286e763b52" containerName="keystone-api"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137075 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137083 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137095 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137102 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137110 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137115 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137124 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137133 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137144 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137151 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="mysql-bootstrap"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137162 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="setup-container"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137169 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="setup-container"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137178 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137184 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137193 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137200 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: E0228 04:32:00.137208 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da690cd-386a-45cf-89a9-4d5a02218af4" containerName="memcached"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137215 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da690cd-386a-45cf-89a9-4d5a02218af4" containerName="memcached"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137319 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="6325e48f-129d-4832-99d8-1cd8088708c3" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137331 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab191f5-56a2-4b06-88be-14286e763b52" containerName="keystone-api"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137340 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da690cd-386a-45cf-89a9-4d5a02218af4" containerName="memcached"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137345 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35bd0a7-4cba-4185-a45c-bfaf82c04638" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137353 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fe0761-ad11-4ccf-9511-2c074bed0915" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137361 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c25f535-2cfb-40b6-9412-9888a0fc1975" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137368 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ce90ac-7ea8-44b6-bfae-05f51789c804" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137376 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba36008b-f798-4c99-bb4a-684f98897de8" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137384 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="8171cc83-a178-4d19-b1c5-0d93b123838c" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137390 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec1561-2733-469f-b4b4-13035f2557f0" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137412 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137421 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb49b1ff-eed4-41e8-a6e2-5b1514499d41" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137432 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="373e4c12-ee6c-4f89-b684-fb8e61d18c9f" containerName="operator"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137442 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56491ab-6d17-4127-a25b-75b5e900e0aa" containerName="galera"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137460 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea9870e-2b10-4d66-b478-023e2aed2ced" containerName="registry-server"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137470 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ac4fd8-0666-458f-8910-26abc36f0bdd" containerName="mariadb-account-delete"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137481 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3f522d3-98e9-446c-985e-01fbeb36f25d" containerName="manager"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137489 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71c158a-9876-4f8e-9100-7c0a36834415" containerName="rabbitmq"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.137917 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537552-ms869"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.142183 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.142333 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.142489 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.147967 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gtb\" (UniqueName: \"kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb\") pod \"auto-csr-approver-29537552-ms869\" (UID: \"28e1152f-ab56-472e-b5b2-d859aee63a9c\") " pod="openshift-infra/auto-csr-approver-29537552-ms869"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.149793 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537552-ms869"]
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.249314 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gtb\" (UniqueName: \"kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb\") pod \"auto-csr-approver-29537552-ms869\" (UID: \"28e1152f-ab56-472e-b5b2-d859aee63a9c\") " pod="openshift-infra/auto-csr-approver-29537552-ms869"
Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.275517 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gtb\" (UniqueName: \"kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb\") pod \"auto-csr-approver-29537552-ms869\" (UID: \"28e1152f-ab56-472e-b5b2-d859aee63a9c\") "
pod="openshift-infra/auto-csr-approver-29537552-ms869" Feb 28 04:32:00 crc kubenswrapper[5072]: I0228 04:32:00.458757 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537552-ms869" Feb 28 04:32:01 crc kubenswrapper[5072]: I0228 04:32:00.667780 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea9870e-2b10-4d66-b478-023e2aed2ced" path="/var/lib/kubelet/pods/aea9870e-2b10-4d66-b478-023e2aed2ced/volumes" Feb 28 04:32:01 crc kubenswrapper[5072]: I0228 04:32:00.669329 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f522d3-98e9-446c-985e-01fbeb36f25d" path="/var/lib/kubelet/pods/f3f522d3-98e9-446c-985e-01fbeb36f25d/volumes" Feb 28 04:32:01 crc kubenswrapper[5072]: I0228 04:32:00.869213 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537552-ms869"] Feb 28 04:32:01 crc kubenswrapper[5072]: W0228 04:32:00.879892 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e1152f_ab56_472e_b5b2_d859aee63a9c.slice/crio-514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357 WatchSource:0}: Error finding container 514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357: Status 404 returned error can't find the container with id 514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357 Feb 28 04:32:01 crc kubenswrapper[5072]: I0228 04:32:00.883551 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 04:32:01 crc kubenswrapper[5072]: I0228 04:32:01.159007 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537552-ms869" event={"ID":"28e1152f-ab56-472e-b5b2-d859aee63a9c","Type":"ContainerStarted","Data":"514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357"} Feb 28 04:32:02 crc kubenswrapper[5072]: I0228 
04:32:02.165564 5072 generic.go:334] "Generic (PLEG): container finished" podID="28e1152f-ab56-472e-b5b2-d859aee63a9c" containerID="dc88d0c53537db4a17fc1d8e453cb3b7175c49858db7696b95234399a7f86d10" exitCode=0 Feb 28 04:32:02 crc kubenswrapper[5072]: I0228 04:32:02.165630 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537552-ms869" event={"ID":"28e1152f-ab56-472e-b5b2-d859aee63a9c","Type":"ContainerDied","Data":"dc88d0c53537db4a17fc1d8e453cb3b7175c49858db7696b95234399a7f86d10"} Feb 28 04:32:03 crc kubenswrapper[5072]: I0228 04:32:03.445694 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537552-ms869" Feb 28 04:32:03 crc kubenswrapper[5072]: I0228 04:32:03.496510 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gtb\" (UniqueName: \"kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb\") pod \"28e1152f-ab56-472e-b5b2-d859aee63a9c\" (UID: \"28e1152f-ab56-472e-b5b2-d859aee63a9c\") " Feb 28 04:32:03 crc kubenswrapper[5072]: I0228 04:32:03.501140 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb" (OuterVolumeSpecName: "kube-api-access-c6gtb") pod "28e1152f-ab56-472e-b5b2-d859aee63a9c" (UID: "28e1152f-ab56-472e-b5b2-d859aee63a9c"). InnerVolumeSpecName "kube-api-access-c6gtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:32:03 crc kubenswrapper[5072]: I0228 04:32:03.598217 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gtb\" (UniqueName: \"kubernetes.io/projected/28e1152f-ab56-472e-b5b2-d859aee63a9c-kube-api-access-c6gtb\") on node \"crc\" DevicePath \"\"" Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.179619 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537552-ms869" event={"ID":"28e1152f-ab56-472e-b5b2-d859aee63a9c","Type":"ContainerDied","Data":"514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357"} Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.179718 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514b21b2907d9887f21c2ca9ed5808d4ab4feab2571eb40b42532abc3cbaf357" Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.179739 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537552-ms869" Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.506530 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537546-ztbp6"] Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.510168 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537546-ztbp6"] Feb 28 04:32:04 crc kubenswrapper[5072]: I0228 04:32:04.666730 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb6cfd5-00f1-4028-b63e-96effbd865f0" path="/var/lib/kubelet/pods/1bb6cfd5-00f1-4028-b63e-96effbd865f0/volumes" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.950501 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4p64t/must-gather-lj9ph"] Feb 28 04:32:10 crc kubenswrapper[5072]: E0228 04:32:10.951322 5072 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28e1152f-ab56-472e-b5b2-d859aee63a9c" containerName="oc" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.951336 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e1152f-ab56-472e-b5b2-d859aee63a9c" containerName="oc" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.951439 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e1152f-ab56-472e-b5b2-d859aee63a9c" containerName="oc" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.952232 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.956181 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4p64t"/"kube-root-ca.crt" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.956860 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4p64t"/"openshift-service-ca.crt" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.962552 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4p64t"/"default-dockercfg-whr2d" Feb 28 04:32:10 crc kubenswrapper[5072]: I0228 04:32:10.976089 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4p64t/must-gather-lj9ph"] Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.003550 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfjg\" (UniqueName: \"kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.003839 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.105363 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.105442 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfjg\" (UniqueName: \"kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.106589 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.125449 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfjg\" (UniqueName: \"kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg\") pod \"must-gather-lj9ph\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") " pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.265464 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4p64t/must-gather-lj9ph" Feb 28 04:32:11 crc kubenswrapper[5072]: I0228 04:32:11.677029 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4p64t/must-gather-lj9ph"] Feb 28 04:32:12 crc kubenswrapper[5072]: I0228 04:32:12.235926 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4p64t/must-gather-lj9ph" event={"ID":"22a7d4db-bde8-474f-8ba6-ff8332b7127f","Type":"ContainerStarted","Data":"4494397af5df624677cf8538851e4991acacc09d84e3e0242e270436c6483693"} Feb 28 04:32:17 crc kubenswrapper[5072]: I0228 04:32:17.271923 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4p64t/must-gather-lj9ph" event={"ID":"22a7d4db-bde8-474f-8ba6-ff8332b7127f","Type":"ContainerStarted","Data":"0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233"} Feb 28 04:32:17 crc kubenswrapper[5072]: I0228 04:32:17.272508 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4p64t/must-gather-lj9ph" event={"ID":"22a7d4db-bde8-474f-8ba6-ff8332b7127f","Type":"ContainerStarted","Data":"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"} Feb 28 04:32:17 crc kubenswrapper[5072]: I0228 04:32:17.291279 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4p64t/must-gather-lj9ph" podStartSLOduration=2.457580896 podStartE2EDuration="7.291250185s" podCreationTimestamp="2026-02-28 04:32:10 +0000 UTC" firstStartedPulling="2026-02-28 04:32:11.691570947 +0000 UTC m=+1353.686301149" lastFinishedPulling="2026-02-28 04:32:16.525240246 +0000 UTC m=+1358.519970438" observedRunningTime="2026-02-28 04:32:17.284875254 +0000 UTC m=+1359.279605456" watchObservedRunningTime="2026-02-28 04:32:17.291250185 +0000 UTC m=+1359.285980387" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.105598 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.105683 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.589779 5072 scope.go:117] "RemoveContainer" containerID="dfe9c897421e35a94eeef77a6f2ff37f92cf75e56fd8cb883528baa35adef364" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.621111 5072 scope.go:117] "RemoveContainer" containerID="653b0d24fa4ea520daca0f2d7978213ecb3dc0d0059891612c42ddb47c704811" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.645378 5072 scope.go:117] "RemoveContainer" containerID="d5241d62112b26e63145d089dcdd4fcfaf4bbc5a583c5ae7158f1de89c74d6c0" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.661081 5072 scope.go:117] "RemoveContainer" containerID="809343c9ee42f6592affc2b8749731fd47732d8e9729c316750331a32d6adba9" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.694245 5072 scope.go:117] "RemoveContainer" containerID="b83bd763adc7d84d3c0b3a25ef9c56636923f98d8c37e9276914373942cd8520" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.711534 5072 scope.go:117] "RemoveContainer" containerID="e31182fb92a2f9129ff8de4bee9c0da29c7b6970056ab9089b995d03577aa6b3" Feb 28 04:32:20 crc kubenswrapper[5072]: I0228 04:32:20.751999 5072 scope.go:117] "RemoveContainer" containerID="35888c4b552ad7c4d2ba2dea0a3ad2136b741607e5dfc7779234fafc29e8a90a" Feb 28 04:32:50 crc kubenswrapper[5072]: I0228 04:32:50.105544 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:32:50 crc kubenswrapper[5072]: I0228 04:32:50.106020 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:33:02 crc kubenswrapper[5072]: I0228 04:33:02.166590 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zc5mk_672db961-8de6-46ec-9dd8-5d2ef7572eef/control-plane-machine-set-operator/0.log" Feb 28 04:33:02 crc kubenswrapper[5072]: I0228 04:33:02.354060 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9vfgz_707bbe1d-eb9e-4d9d-8e70-e88429b8c077/machine-api-operator/0.log" Feb 28 04:33:02 crc kubenswrapper[5072]: I0228 04:33:02.376152 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9vfgz_707bbe1d-eb9e-4d9d-8e70-e88429b8c077/kube-rbac-proxy/0.log" Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.106124 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.106932 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.106977 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.107629 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.107705 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67" gracePeriod=600 Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.643590 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67" exitCode=0 Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.643684 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67"} Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.643947 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" 
event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"} Feb 28 04:33:20 crc kubenswrapper[5072]: I0228 04:33:20.643981 5072 scope.go:117] "RemoveContainer" containerID="12b4b3f484e46c0cfc12fc90f1da58cb1b716b35bd291d441c02d1fe8abc9e04" Feb 28 04:33:21 crc kubenswrapper[5072]: I0228 04:33:21.013769 5072 scope.go:117] "RemoveContainer" containerID="1f526d6333b068a7c3555bfc1959efaf818ea2668e28d86fd37e3e1bd8a6abb0" Feb 28 04:33:21 crc kubenswrapper[5072]: I0228 04:33:21.030230 5072 scope.go:117] "RemoveContainer" containerID="97375b69562f960915f9036bf18eae4be0c446cda5fc0681419c1666f09be26a" Feb 28 04:33:21 crc kubenswrapper[5072]: I0228 04:33:21.072923 5072 scope.go:117] "RemoveContainer" containerID="016c30a85ac29fe47d215edf0d46c749f707c35ed18e7c3c68a3b8c3448b15d1" Feb 28 04:33:21 crc kubenswrapper[5072]: I0228 04:33:21.089416 5072 scope.go:117] "RemoveContainer" containerID="33ccc774ab28074e17fa9ba2a1ebd3191889cd797043e7b717faefdce0f44596" Feb 28 04:33:29 crc kubenswrapper[5072]: I0228 04:33:29.668404 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wf6c2_81c4d0e9-644c-4f99-af4b-0d73be068ca2/kube-rbac-proxy/0.log" Feb 28 04:33:29 crc kubenswrapper[5072]: I0228 04:33:29.669618 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wf6c2_81c4d0e9-644c-4f99-af4b-0d73be068ca2/controller/0.log" Feb 28 04:33:29 crc kubenswrapper[5072]: I0228 04:33:29.798567 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.015929 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.022875 5072 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.040342 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.062057 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.233625 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.250833 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.253857 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.282527 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.475120 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.512293 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.513044 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.515759 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/controller/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.725782 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/kube-rbac-proxy-frr/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.725847 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/kube-rbac-proxy/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.737252 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/frr-metrics/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.906106 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/reloader/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.932726 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/frr/0.log" Feb 28 04:33:30 crc kubenswrapper[5072]: I0228 04:33:30.935166 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-6mp9v_9783d250-c2b9-4e29-a8d7-94d92d301478/frr-k8s-webhook-server/0.log" Feb 28 04:33:31 crc kubenswrapper[5072]: I0228 04:33:31.217768 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85768d6f57-5rpmg_66660768-8bc9-40af-baab-529d0820c10b/manager/0.log" Feb 28 04:33:31 crc kubenswrapper[5072]: I0228 04:33:31.254234 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b95579fd-hmq5d_67512bfe-55b8-4df0-aa98-54225fc624a3/webhook-server/0.log" Feb 28 04:33:31 crc kubenswrapper[5072]: I0228 04:33:31.407297 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vlqxq_7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c/kube-rbac-proxy/0.log" Feb 28 04:33:31 crc kubenswrapper[5072]: I0228 04:33:31.483884 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vlqxq_7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c/speaker/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.244379 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.425234 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.448715 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.495457 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.602459 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.606505 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log" Feb 28 04:33:54 
crc kubenswrapper[5072]: I0228 04:33:54.841462 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.954475 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/registry-server/0.log" Feb 28 04:33:54 crc kubenswrapper[5072]: I0228 04:33:54.971274 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.029302 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.073129 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.233492 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.289822 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.418303 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.467352 5072 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/registry-server/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.600098 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.608666 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log" Feb 28 04:33:55 crc kubenswrapper[5072]: I0228 04:33:55.787884 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.068872 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.069867 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/extract/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.075842 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.336870 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jxjmq_cf5cb269-db5d-4b8d-ba70-a583c95dd586/marketplace-operator/0.log" Feb 28 04:33:56 crc 
kubenswrapper[5072]: I0228 04:33:56.439472 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.773432 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.782472 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.782690 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log" Feb 28 04:33:56 crc kubenswrapper[5072]: I0228 04:33:56.956174 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.016238 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.043891 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/registry-server/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.130744 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.315159 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.316977 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.345102 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.484420 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.497234 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log" Feb 28 04:33:57 crc kubenswrapper[5072]: I0228 04:33:57.815565 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/registry-server/0.log" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.158604 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537554-dnn6z"] Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.159572 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.162747 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.162840 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.163363 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.176302 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537554-dnn6z"] Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.339237 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st4nh\" (UniqueName: \"kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh\") pod \"auto-csr-approver-29537554-dnn6z\" (UID: \"d76583e4-1977-4ec3-a097-b0e22f3569dc\") " pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.440836 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st4nh\" (UniqueName: \"kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh\") pod \"auto-csr-approver-29537554-dnn6z\" (UID: \"d76583e4-1977-4ec3-a097-b0e22f3569dc\") " pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.463562 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st4nh\" (UniqueName: \"kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh\") pod \"auto-csr-approver-29537554-dnn6z\" (UID: \"d76583e4-1977-4ec3-a097-b0e22f3569dc\") " 
pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.515395 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:00 crc kubenswrapper[5072]: I0228 04:34:00.901394 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537554-dnn6z"] Feb 28 04:34:01 crc kubenswrapper[5072]: I0228 04:34:01.920850 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" event={"ID":"d76583e4-1977-4ec3-a097-b0e22f3569dc","Type":"ContainerStarted","Data":"c2dd03be1acd8b622cf1978111c648ddb2d23ff6ccaa916c0eb2bb9e13e4e5cb"} Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.001923 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"] Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.003223 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.017510 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"] Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.162573 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.162707 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.162756 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98vdv\" (UniqueName: \"kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.264346 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.264716 5072 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-98vdv\" (UniqueName: \"kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.264752 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.265415 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.265414 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.291390 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98vdv\" (UniqueName: \"kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv\") pod \"redhat-operators-kcbwd\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.334234 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.773607 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"] Feb 28 04:34:02 crc kubenswrapper[5072]: W0228 04:34:02.781239 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd97fece_aaaf_40eb_86b4_f0bbd198bf3f.slice/crio-5415ab43d8e1cc6dcb3b727590ff3b19bb483e39bd47f6e949aad86132b69e1d WatchSource:0}: Error finding container 5415ab43d8e1cc6dcb3b727590ff3b19bb483e39bd47f6e949aad86132b69e1d: Status 404 returned error can't find the container with id 5415ab43d8e1cc6dcb3b727590ff3b19bb483e39bd47f6e949aad86132b69e1d Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.930843 5072 generic.go:334] "Generic (PLEG): container finished" podID="d76583e4-1977-4ec3-a097-b0e22f3569dc" containerID="bb89b8819adc51354920ac27a8805b66feb8724a9eba2d0465da88f99456bab3" exitCode=0 Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.930898 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" event={"ID":"d76583e4-1977-4ec3-a097-b0e22f3569dc","Type":"ContainerDied","Data":"bb89b8819adc51354920ac27a8805b66feb8724a9eba2d0465da88f99456bab3"} Feb 28 04:34:02 crc kubenswrapper[5072]: I0228 04:34:02.932401 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerStarted","Data":"5415ab43d8e1cc6dcb3b727590ff3b19bb483e39bd47f6e949aad86132b69e1d"} Feb 28 04:34:03 crc kubenswrapper[5072]: I0228 04:34:03.941709 5072 generic.go:334] "Generic (PLEG): container finished" podID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerID="61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8" exitCode=0 Feb 28 04:34:03 crc kubenswrapper[5072]: I0228 
04:34:03.941763 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerDied","Data":"61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8"} Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.189580 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.389204 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st4nh\" (UniqueName: \"kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh\") pod \"d76583e4-1977-4ec3-a097-b0e22f3569dc\" (UID: \"d76583e4-1977-4ec3-a097-b0e22f3569dc\") " Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.402863 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh" (OuterVolumeSpecName: "kube-api-access-st4nh") pod "d76583e4-1977-4ec3-a097-b0e22f3569dc" (UID: "d76583e4-1977-4ec3-a097-b0e22f3569dc"). InnerVolumeSpecName "kube-api-access-st4nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.491167 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st4nh\" (UniqueName: \"kubernetes.io/projected/d76583e4-1977-4ec3-a097-b0e22f3569dc-kube-api-access-st4nh\") on node \"crc\" DevicePath \"\"" Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.949004 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.949015 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537554-dnn6z" event={"ID":"d76583e4-1977-4ec3-a097-b0e22f3569dc","Type":"ContainerDied","Data":"c2dd03be1acd8b622cf1978111c648ddb2d23ff6ccaa916c0eb2bb9e13e4e5cb"} Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.949067 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2dd03be1acd8b622cf1978111c648ddb2d23ff6ccaa916c0eb2bb9e13e4e5cb" Feb 28 04:34:04 crc kubenswrapper[5072]: I0228 04:34:04.950899 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerStarted","Data":"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221"} Feb 28 04:34:05 crc kubenswrapper[5072]: I0228 04:34:05.251108 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537548-jhcpw"] Feb 28 04:34:05 crc kubenswrapper[5072]: I0228 04:34:05.256842 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537548-jhcpw"] Feb 28 04:34:05 crc kubenswrapper[5072]: I0228 04:34:05.959710 5072 generic.go:334] "Generic (PLEG): container finished" podID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerID="c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221" exitCode=0 Feb 28 04:34:05 crc kubenswrapper[5072]: I0228 04:34:05.959748 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerDied","Data":"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221"} Feb 28 04:34:06 crc kubenswrapper[5072]: I0228 04:34:06.672236 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="08a2756a-1558-472d-8e33-f3b8009eadab" path="/var/lib/kubelet/pods/08a2756a-1558-472d-8e33-f3b8009eadab/volumes" Feb 28 04:34:06 crc kubenswrapper[5072]: I0228 04:34:06.966601 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerStarted","Data":"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8"} Feb 28 04:34:06 crc kubenswrapper[5072]: I0228 04:34:06.988031 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kcbwd" podStartSLOduration=3.541117666 podStartE2EDuration="5.988015831s" podCreationTimestamp="2026-02-28 04:34:01 +0000 UTC" firstStartedPulling="2026-02-28 04:34:03.943824898 +0000 UTC m=+1465.938555090" lastFinishedPulling="2026-02-28 04:34:06.390723063 +0000 UTC m=+1468.385453255" observedRunningTime="2026-02-28 04:34:06.984425638 +0000 UTC m=+1468.979155850" watchObservedRunningTime="2026-02-28 04:34:06.988015831 +0000 UTC m=+1468.982746023" Feb 28 04:34:12 crc kubenswrapper[5072]: I0228 04:34:12.334634 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:12 crc kubenswrapper[5072]: I0228 04:34:12.335403 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:13 crc kubenswrapper[5072]: I0228 04:34:13.370219 5072 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcbwd" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="registry-server" probeResult="failure" output=< Feb 28 04:34:13 crc kubenswrapper[5072]: timeout: failed to connect service ":50051" within 1s Feb 28 04:34:13 crc kubenswrapper[5072]: > Feb 28 04:34:21 crc kubenswrapper[5072]: I0228 04:34:21.143725 5072 scope.go:117] "RemoveContainer" 
containerID="1e397dde6a93677fc8e88c0f3e937758b2b430a51f0e349375967baad2b97822" Feb 28 04:34:21 crc kubenswrapper[5072]: I0228 04:34:21.187767 5072 scope.go:117] "RemoveContainer" containerID="c99c67d773d47537e40138ff4e744bc04738a379e6121f5997c86d12e6ec484e" Feb 28 04:34:21 crc kubenswrapper[5072]: I0228 04:34:21.221976 5072 scope.go:117] "RemoveContainer" containerID="ff1d0c7815426e13f87eebffa095214fb81ce4500673644df05798044ffac016" Feb 28 04:34:21 crc kubenswrapper[5072]: I0228 04:34:21.245704 5072 scope.go:117] "RemoveContainer" containerID="f9638a0cec913e593ae258b1ac7834421159ad66a6576002408944fd84951445" Feb 28 04:34:22 crc kubenswrapper[5072]: I0228 04:34:22.381979 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:22 crc kubenswrapper[5072]: I0228 04:34:22.428209 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:22 crc kubenswrapper[5072]: I0228 04:34:22.613394 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"] Feb 28 04:34:24 crc kubenswrapper[5072]: I0228 04:34:24.064137 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kcbwd" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="registry-server" containerID="cri-o://2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8" gracePeriod=2 Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.000631 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.085971 5072 generic.go:334] "Generic (PLEG): container finished" podID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerID="2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8" exitCode=0 Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.086293 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerDied","Data":"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8"} Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.086325 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcbwd" event={"ID":"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f","Type":"ContainerDied","Data":"5415ab43d8e1cc6dcb3b727590ff3b19bb483e39bd47f6e949aad86132b69e1d"} Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.086346 5072 scope.go:117] "RemoveContainer" containerID="2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.086488 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcbwd" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.106916 5072 scope.go:117] "RemoveContainer" containerID="c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.126517 5072 scope.go:117] "RemoveContainer" containerID="61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.129873 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities\") pod \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.129921 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98vdv\" (UniqueName: \"kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv\") pod \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.129955 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content\") pod \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\" (UID: \"dd97fece-aaaf-40eb-86b4-f0bbd198bf3f\") " Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.131611 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities" (OuterVolumeSpecName: "utilities") pod "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" (UID: "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.137855 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv" (OuterVolumeSpecName: "kube-api-access-98vdv") pod "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" (UID: "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f"). InnerVolumeSpecName "kube-api-access-98vdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.188091 5072 scope.go:117] "RemoveContainer" containerID="2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8" Feb 28 04:34:25 crc kubenswrapper[5072]: E0228 04:34:25.189719 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8\": container with ID starting with 2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8 not found: ID does not exist" containerID="2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.189765 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8"} err="failed to get container status \"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8\": rpc error: code = NotFound desc = could not find container \"2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8\": container with ID starting with 2f92fb2617dd145cae358d9e2a386915c3070703877070705936465bcc4ce9a8 not found: ID does not exist" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.189791 5072 scope.go:117] "RemoveContainer" containerID="c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221" Feb 28 04:34:25 crc kubenswrapper[5072]: E0228 04:34:25.190212 
5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221\": container with ID starting with c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221 not found: ID does not exist" containerID="c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.190296 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221"} err="failed to get container status \"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221\": rpc error: code = NotFound desc = could not find container \"c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221\": container with ID starting with c54e908260aea9a6478f66312dcde51790bdcfc9a85839569b455088eebf4221 not found: ID does not exist" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.190355 5072 scope.go:117] "RemoveContainer" containerID="61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8" Feb 28 04:34:25 crc kubenswrapper[5072]: E0228 04:34:25.191585 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8\": container with ID starting with 61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8 not found: ID does not exist" containerID="61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.191722 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8"} err="failed to get container status \"61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8\": rpc error: code = 
NotFound desc = could not find container \"61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8\": container with ID starting with 61bfdc317d45f0ebc190db013b3e7a8742a0cbe34b6b7dbed929555f29468bb8 not found: ID does not exist" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.231618 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.231674 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98vdv\" (UniqueName: \"kubernetes.io/projected/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-kube-api-access-98vdv\") on node \"crc\" DevicePath \"\"" Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.257293 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" (UID: "dd97fece-aaaf-40eb-86b4-f0bbd198bf3f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.333418 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.424307 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"]
Feb 28 04:34:25 crc kubenswrapper[5072]: I0228 04:34:25.429993 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kcbwd"]
Feb 28 04:34:26 crc kubenswrapper[5072]: I0228 04:34:26.665151 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" path="/var/lib/kubelet/pods/dd97fece-aaaf-40eb-86b4-f0bbd198bf3f/volumes"
Feb 28 04:35:10 crc kubenswrapper[5072]: I0228 04:35:10.401206 5072 generic.go:334] "Generic (PLEG): container finished" podID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerID="efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a" exitCode=0
Feb 28 04:35:10 crc kubenswrapper[5072]: I0228 04:35:10.401312 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4p64t/must-gather-lj9ph" event={"ID":"22a7d4db-bde8-474f-8ba6-ff8332b7127f","Type":"ContainerDied","Data":"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"}
Feb 28 04:35:10 crc kubenswrapper[5072]: I0228 04:35:10.402163 5072 scope.go:117] "RemoveContainer" containerID="efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"
Feb 28 04:35:10 crc kubenswrapper[5072]: I0228 04:35:10.503157 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4p64t_must-gather-lj9ph_22a7d4db-bde8-474f-8ba6-ff8332b7127f/gather/0.log"
Feb 28 04:35:17 crc kubenswrapper[5072]: I0228 04:35:17.580343 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4p64t/must-gather-lj9ph"]
Feb 28 04:35:17 crc kubenswrapper[5072]: I0228 04:35:17.581157 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4p64t/must-gather-lj9ph" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="copy" containerID="cri-o://0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233" gracePeriod=2
Feb 28 04:35:17 crc kubenswrapper[5072]: I0228 04:35:17.584459 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4p64t/must-gather-lj9ph"]
Feb 28 04:35:17 crc kubenswrapper[5072]: I0228 04:35:17.943791 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4p64t_must-gather-lj9ph_22a7d4db-bde8-474f-8ba6-ff8332b7127f/copy/0.log"
Feb 28 04:35:17 crc kubenswrapper[5072]: I0228 04:35:17.944403 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4p64t/must-gather-lj9ph"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.053817 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output\") pod \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") "
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.053904 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvfjg\" (UniqueName: \"kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg\") pod \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\" (UID: \"22a7d4db-bde8-474f-8ba6-ff8332b7127f\") "
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.061815 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg" (OuterVolumeSpecName: "kube-api-access-nvfjg") pod "22a7d4db-bde8-474f-8ba6-ff8332b7127f" (UID: "22a7d4db-bde8-474f-8ba6-ff8332b7127f"). InnerVolumeSpecName "kube-api-access-nvfjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.113273 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "22a7d4db-bde8-474f-8ba6-ff8332b7127f" (UID: "22a7d4db-bde8-474f-8ba6-ff8332b7127f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.155616 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvfjg\" (UniqueName: \"kubernetes.io/projected/22a7d4db-bde8-474f-8ba6-ff8332b7127f-kube-api-access-nvfjg\") on node \"crc\" DevicePath \"\""
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.155896 5072 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22a7d4db-bde8-474f-8ba6-ff8332b7127f-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.451785 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4p64t_must-gather-lj9ph_22a7d4db-bde8-474f-8ba6-ff8332b7127f/copy/0.log"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.452093 5072 generic.go:334] "Generic (PLEG): container finished" podID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerID="0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233" exitCode=143
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.452138 5072 scope.go:117] "RemoveContainer" containerID="0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.452151 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4p64t/must-gather-lj9ph"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.468796 5072 scope.go:117] "RemoveContainer" containerID="efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.503682 5072 scope.go:117] "RemoveContainer" containerID="0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233"
Feb 28 04:35:18 crc kubenswrapper[5072]: E0228 04:35:18.504208 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233\": container with ID starting with 0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233 not found: ID does not exist" containerID="0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.504270 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233"} err="failed to get container status \"0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233\": rpc error: code = NotFound desc = could not find container \"0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233\": container with ID starting with 0a1a3e19e801de01e9cc2ae55385f65461b7e49998f0a9c462404b67c2d3b233 not found: ID does not exist"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.504313 5072 scope.go:117] "RemoveContainer" containerID="efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"
Feb 28 04:35:18 crc kubenswrapper[5072]: E0228 04:35:18.504713 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a\": container with ID starting with efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a not found: ID does not exist" containerID="efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.504766 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a"} err="failed to get container status \"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a\": rpc error: code = NotFound desc = could not find container \"efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a\": container with ID starting with efa5c887127247a93ccfabd4a42077c35c4eaf7cb54061d2962f201d57713d9a not found: ID does not exist"
Feb 28 04:35:18 crc kubenswrapper[5072]: I0228 04:35:18.667049 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" path="/var/lib/kubelet/pods/22a7d4db-bde8-474f-8ba6-ff8332b7127f/volumes"
Feb 28 04:35:20 crc kubenswrapper[5072]: I0228 04:35:20.105849 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:35:20 crc kubenswrapper[5072]: I0228 04:35:20.106188 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.303780 5072 scope.go:117] "RemoveContainer" containerID="4d8a41aacb1e93112baac1a853fd5334e2d362789f3d02d90928e91c78a0e690"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.320512 5072 scope.go:117] "RemoveContainer" containerID="a25b28fed269ede81a5fc760a79dfb9947cece8bd5bbc9f8b86506a2ad6f18ec"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.339406 5072 scope.go:117] "RemoveContainer" containerID="4683ec73a7f6874e4e5185b13be360203fa99747a921ac6c540975beca0ed9f5"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.356346 5072 scope.go:117] "RemoveContainer" containerID="24d89ef02fafea74f917037f7a02488a825cb5c52148d4d1e00fad0f9a2149fb"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.377018 5072 scope.go:117] "RemoveContainer" containerID="3bcec2ae5530ff7b04b1f31f5f972f7f68cf676d949e3a12f3852a7bb1ab9ecb"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.405631 5072 scope.go:117] "RemoveContainer" containerID="0cf647178618894fa41d0aabb4dd23ee31971bc7401ba75c71d69b60e4e65c83"
Feb 28 04:35:21 crc kubenswrapper[5072]: I0228 04:35:21.447404 5072 scope.go:117] "RemoveContainer" containerID="0753d3168a45c1b43d11c6abbc0f2d6e050f3147149778501091ade7e9988331"
Feb 28 04:35:50 crc kubenswrapper[5072]: I0228 04:35:50.105948 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:35:50 crc kubenswrapper[5072]: I0228 04:35:50.107060 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.604514 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605109 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="extract-content"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605122 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="extract-content"
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605132 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="gather"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605138 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="gather"
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605144 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="registry-server"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605151 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="registry-server"
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605162 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="extract-utilities"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605167 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="extract-utilities"
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605178 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76583e4-1977-4ec3-a097-b0e22f3569dc" containerName="oc"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605183 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76583e4-1977-4ec3-a097-b0e22f3569dc" containerName="oc"
Feb 28 04:35:57 crc kubenswrapper[5072]: E0228 04:35:57.605252 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="copy"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605259 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="copy"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605356 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="copy"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605368 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76583e4-1977-4ec3-a097-b0e22f3569dc" containerName="oc"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605376 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a7d4db-bde8-474f-8ba6-ff8332b7127f" containerName="gather"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.605384 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd97fece-aaaf-40eb-86b4-f0bbd198bf3f" containerName="registry-server"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.606111 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.653318 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.774466 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.774816 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527ml\" (UniqueName: \"kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.774852 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.876604 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.877071 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-527ml\" (UniqueName: \"kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.877029 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.877119 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.877719 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.899485 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-527ml\" (UniqueName: \"kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml\") pod \"community-operators-44tkt\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") " pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:57 crc kubenswrapper[5072]: I0228 04:35:57.940238 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:35:58 crc kubenswrapper[5072]: I0228 04:35:58.384906 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:35:58 crc kubenswrapper[5072]: I0228 04:35:58.712371 5072 generic.go:334] "Generic (PLEG): container finished" podID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerID="8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd" exitCode=0
Feb 28 04:35:58 crc kubenswrapper[5072]: I0228 04:35:58.712415 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerDied","Data":"8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd"}
Feb 28 04:35:58 crc kubenswrapper[5072]: I0228 04:35:58.712439 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerStarted","Data":"5dcde239610f000cecf60b0c7b0683bac23ad43c28fb5b95640b311bd6501528"}
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.125980 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537556-66srt"]
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.126628 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.132831 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.133025 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.134271 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.139425 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537556-66srt"]
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.205885 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sptdw\" (UniqueName: \"kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw\") pod \"auto-csr-approver-29537556-66srt\" (UID: \"24f274e3-9468-4418-9b22-ccda48145e19\") " pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.306881 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sptdw\" (UniqueName: \"kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw\") pod \"auto-csr-approver-29537556-66srt\" (UID: \"24f274e3-9468-4418-9b22-ccda48145e19\") " pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.330696 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sptdw\" (UniqueName: \"kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw\") pod \"auto-csr-approver-29537556-66srt\" (UID: \"24f274e3-9468-4418-9b22-ccda48145e19\") " pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:00 crc kubenswrapper[5072]: I0228 04:36:00.443085 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:01 crc kubenswrapper[5072]: I0228 04:36:01.039100 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537556-66srt"]
Feb 28 04:36:01 crc kubenswrapper[5072]: W0228 04:36:01.115676 5072 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24f274e3_9468_4418_9b22_ccda48145e19.slice/crio-8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5 WatchSource:0}: Error finding container 8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5: Status 404 returned error can't find the container with id 8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5
Feb 28 04:36:01 crc kubenswrapper[5072]: I0228 04:36:01.751597 5072 generic.go:334] "Generic (PLEG): container finished" podID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerID="8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac" exitCode=0
Feb 28 04:36:01 crc kubenswrapper[5072]: I0228 04:36:01.751699 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerDied","Data":"8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac"}
Feb 28 04:36:01 crc kubenswrapper[5072]: I0228 04:36:01.762425 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537556-66srt" event={"ID":"24f274e3-9468-4418-9b22-ccda48145e19","Type":"ContainerStarted","Data":"8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5"}
Feb 28 04:36:03 crc kubenswrapper[5072]: I0228 04:36:03.775833 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerStarted","Data":"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"}
Feb 28 04:36:03 crc kubenswrapper[5072]: I0228 04:36:03.778803 5072 generic.go:334] "Generic (PLEG): container finished" podID="24f274e3-9468-4418-9b22-ccda48145e19" containerID="c32e3fb8e067a9f85a7dab7e939e8d29d3259641af0f724a1e446f92cabdfa52" exitCode=0
Feb 28 04:36:03 crc kubenswrapper[5072]: I0228 04:36:03.778850 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537556-66srt" event={"ID":"24f274e3-9468-4418-9b22-ccda48145e19","Type":"ContainerDied","Data":"c32e3fb8e067a9f85a7dab7e939e8d29d3259641af0f724a1e446f92cabdfa52"}
Feb 28 04:36:03 crc kubenswrapper[5072]: I0228 04:36:03.794313 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-44tkt" podStartSLOduration=2.612871469 podStartE2EDuration="6.794292657s" podCreationTimestamp="2026-02-28 04:35:57 +0000 UTC" firstStartedPulling="2026-02-28 04:35:58.714273118 +0000 UTC m=+1580.709003310" lastFinishedPulling="2026-02-28 04:36:02.895694306 +0000 UTC m=+1584.890424498" observedRunningTime="2026-02-28 04:36:03.794248695 +0000 UTC m=+1585.788978887" watchObservedRunningTime="2026-02-28 04:36:03.794292657 +0000 UTC m=+1585.789022869"
Feb 28 04:36:04 crc kubenswrapper[5072]: I0228 04:36:04.978235 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.168044 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sptdw\" (UniqueName: \"kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw\") pod \"24f274e3-9468-4418-9b22-ccda48145e19\" (UID: \"24f274e3-9468-4418-9b22-ccda48145e19\") "
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.178553 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw" (OuterVolumeSpecName: "kube-api-access-sptdw") pod "24f274e3-9468-4418-9b22-ccda48145e19" (UID: "24f274e3-9468-4418-9b22-ccda48145e19"). InnerVolumeSpecName "kube-api-access-sptdw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.269909 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sptdw\" (UniqueName: \"kubernetes.io/projected/24f274e3-9468-4418-9b22-ccda48145e19-kube-api-access-sptdw\") on node \"crc\" DevicePath \"\""
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.792416 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537556-66srt" event={"ID":"24f274e3-9468-4418-9b22-ccda48145e19","Type":"ContainerDied","Data":"8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5"}
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.792466 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bad563ea46ab1ac03ce9ebd6893d17d5d1ea3b269e52072503e173d5b03a7c5"
Feb 28 04:36:05 crc kubenswrapper[5072]: I0228 04:36:05.792521 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537556-66srt"
Feb 28 04:36:06 crc kubenswrapper[5072]: I0228 04:36:06.032067 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537550-stc8d"]
Feb 28 04:36:06 crc kubenswrapper[5072]: I0228 04:36:06.034300 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537550-stc8d"]
Feb 28 04:36:06 crc kubenswrapper[5072]: I0228 04:36:06.667835 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17749401-23d2-4c37-b692-9f163e29b7b7" path="/var/lib/kubelet/pods/17749401-23d2-4c37-b692-9f163e29b7b7/volumes"
Feb 28 04:36:07 crc kubenswrapper[5072]: I0228 04:36:07.940487 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:07 crc kubenswrapper[5072]: I0228 04:36:07.940532 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:07 crc kubenswrapper[5072]: I0228 04:36:07.984309 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:08 crc kubenswrapper[5072]: I0228 04:36:08.845312 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:08 crc kubenswrapper[5072]: I0228 04:36:08.885684 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:36:10 crc kubenswrapper[5072]: I0228 04:36:10.818897 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-44tkt" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="registry-server" containerID="cri-o://513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b" gracePeriod=2
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.676787 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.830870 5072 generic.go:334] "Generic (PLEG): container finished" podID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerID="513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b" exitCode=0
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.830915 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerDied","Data":"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"}
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.830942 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44tkt" event={"ID":"f27be646-2a93-4034-bde3-2aeebf9bbc81","Type":"ContainerDied","Data":"5dcde239610f000cecf60b0c7b0683bac23ad43c28fb5b95640b311bd6501528"}
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.830958 5072 scope.go:117] "RemoveContainer" containerID="513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.831014 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44tkt"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.843986 5072 scope.go:117] "RemoveContainer" containerID="8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.852520 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527ml\" (UniqueName: \"kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml\") pod \"f27be646-2a93-4034-bde3-2aeebf9bbc81\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") "
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.852709 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content\") pod \"f27be646-2a93-4034-bde3-2aeebf9bbc81\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") "
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.852807 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities\") pod \"f27be646-2a93-4034-bde3-2aeebf9bbc81\" (UID: \"f27be646-2a93-4034-bde3-2aeebf9bbc81\") "
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.855582 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities" (OuterVolumeSpecName: "utilities") pod "f27be646-2a93-4034-bde3-2aeebf9bbc81" (UID: "f27be646-2a93-4034-bde3-2aeebf9bbc81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.861882 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml" (OuterVolumeSpecName: "kube-api-access-527ml") pod "f27be646-2a93-4034-bde3-2aeebf9bbc81" (UID: "f27be646-2a93-4034-bde3-2aeebf9bbc81"). InnerVolumeSpecName "kube-api-access-527ml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.864587 5072 scope.go:117] "RemoveContainer" containerID="8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.889716 5072 scope.go:117] "RemoveContainer" containerID="513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"
Feb 28 04:36:11 crc kubenswrapper[5072]: E0228 04:36:11.890346 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b\": container with ID starting with 513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b not found: ID does not exist" containerID="513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.890421 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b"} err="failed to get container status \"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b\": rpc error: code = NotFound desc = could not find container \"513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b\": container with ID starting with 513ce44ad45340806b953368a5123df2665252c9b9a4cf88919d9c329282f96b not found: ID does not exist"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.890469 5072 scope.go:117] "RemoveContainer" containerID="8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac"
Feb 28 04:36:11 crc kubenswrapper[5072]: E0228 04:36:11.891066 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac\": container with ID starting with 8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac not found: ID does not exist" containerID="8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.891113 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac"} err="failed to get container status \"8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac\": rpc error: code = NotFound desc = could not find container \"8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac\": container with ID starting with 8e707974cc52d59265cce57353681486f51d79b84eb8dcd68cce822d0caacdac not found: ID does not exist"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.891141 5072 scope.go:117] "RemoveContainer" containerID="8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd"
Feb 28 04:36:11 crc kubenswrapper[5072]: E0228 04:36:11.891847 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd\": container with ID starting with 8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd not found: ID does not exist" containerID="8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.891891 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd"} err="failed to get container status \"8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd\": rpc error: code = NotFound desc = could not find container \"8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd\": container with ID starting with 8f842a63cff615e02110ecfc4fe109639ca5cd4cf7c2c6a22ca4c75d941eb9bd not found: ID does not exist"
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.958919 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 04:36:11 crc kubenswrapper[5072]: I0228 04:36:11.958965 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-527ml\" (UniqueName: \"kubernetes.io/projected/f27be646-2a93-4034-bde3-2aeebf9bbc81-kube-api-access-527ml\") on node \"crc\" DevicePath \"\""
Feb 28 04:36:14 crc kubenswrapper[5072]: I0228 04:36:14.547867 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f27be646-2a93-4034-bde3-2aeebf9bbc81" (UID: "f27be646-2a93-4034-bde3-2aeebf9bbc81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 04:36:14 crc kubenswrapper[5072]: I0228 04:36:14.593102 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27be646-2a93-4034-bde3-2aeebf9bbc81-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 04:36:14 crc kubenswrapper[5072]: I0228 04:36:14.846808 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:36:14 crc kubenswrapper[5072]: I0228 04:36:14.856285 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-44tkt"]
Feb 28 04:36:16 crc kubenswrapper[5072]: I0228 04:36:16.665366 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" path="/var/lib/kubelet/pods/f27be646-2a93-4034-bde3-2aeebf9bbc81/volumes"
Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.106288 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.106979 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.107064 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf"
Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.108225 5072 kuberuntime_manager.go:1027] "Message for Container of
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.108299 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" gracePeriod=600 Feb 28 04:36:20 crc kubenswrapper[5072]: E0228 04:36:20.230076 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.893259 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" exitCode=0 Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.893298 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"} Feb 28 04:36:20 crc kubenswrapper[5072]: I0228 04:36:20.893374 5072 scope.go:117] "RemoveContainer" containerID="27d2fb4f87a04571b7b0a9792f832a0142945d828b2f05c5af46a4307532ae67" Feb 28 04:36:20 crc 
kubenswrapper[5072]: I0228 04:36:20.894049 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:36:20 crc kubenswrapper[5072]: E0228 04:36:20.894354 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:36:21 crc kubenswrapper[5072]: I0228 04:36:21.527755 5072 scope.go:117] "RemoveContainer" containerID="f98bb7ff081c8fd475cb4cb229f88a28dfc46cf3835b5762e73a883639c61235" Feb 28 04:36:32 crc kubenswrapper[5072]: I0228 04:36:32.659280 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:36:32 crc kubenswrapper[5072]: E0228 04:36:32.660151 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:36:44 crc kubenswrapper[5072]: I0228 04:36:44.659066 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:36:44 crc kubenswrapper[5072]: E0228 04:36:44.659676 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:36:57 crc kubenswrapper[5072]: I0228 04:36:57.658751 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:36:57 crc kubenswrapper[5072]: E0228 04:36:57.659545 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:37:10 crc kubenswrapper[5072]: I0228 04:37:10.659193 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:37:10 crc kubenswrapper[5072]: E0228 04:37:10.659861 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:37:21 crc kubenswrapper[5072]: I0228 04:37:21.612550 5072 scope.go:117] "RemoveContainer" containerID="6cda1568b5fdb000bfdf309b5a826a54d00f75e7aa1101d47c352ec97792f81f" Feb 28 04:37:21 crc kubenswrapper[5072]: I0228 04:37:21.759893 5072 scope.go:117] "RemoveContainer" containerID="6188312132e61e01ceba9028d903a90f7e8327ac232362763ecfa9f41ffcfd93" Feb 28 04:37:23 crc kubenswrapper[5072]: I0228 04:37:23.659115 5072 
scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:37:23 crc kubenswrapper[5072]: E0228 04:37:23.659686 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:37:37 crc kubenswrapper[5072]: I0228 04:37:37.659387 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:37:37 crc kubenswrapper[5072]: E0228 04:37:37.659883 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.490798 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2trmj/must-gather-rftc5"] Feb 28 04:37:49 crc kubenswrapper[5072]: E0228 04:37:49.492965 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="extract-content" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493124 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="extract-content" Feb 28 04:37:49 crc kubenswrapper[5072]: E0228 04:37:49.493221 5072 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24f274e3-9468-4418-9b22-ccda48145e19" containerName="oc" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493295 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f274e3-9468-4418-9b22-ccda48145e19" containerName="oc" Feb 28 04:37:49 crc kubenswrapper[5072]: E0228 04:37:49.493371 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="registry-server" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493445 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="registry-server" Feb 28 04:37:49 crc kubenswrapper[5072]: E0228 04:37:49.493528 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="extract-utilities" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493611 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="extract-utilities" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493866 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27be646-2a93-4034-bde3-2aeebf9bbc81" containerName="registry-server" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.493955 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f274e3-9468-4418-9b22-ccda48145e19" containerName="oc" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.494732 5072 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.497951 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2trmj"/"openshift-service-ca.crt" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.498242 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2trmj"/"default-dockercfg-8n47c" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.498337 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2trmj"/"kube-root-ca.crt" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.498677 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2trmj/must-gather-rftc5"] Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.659468 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:37:49 crc kubenswrapper[5072]: E0228 04:37:49.659902 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.688093 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.688154 5072 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvmv\" (UniqueName: \"kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.789158 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.789204 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvmv\" (UniqueName: \"kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.789679 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.807221 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvmv\" (UniqueName: \"kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv\") pod \"must-gather-rftc5\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:49 crc kubenswrapper[5072]: I0228 04:37:49.814081 5072 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:37:50 crc kubenswrapper[5072]: I0228 04:37:50.028845 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2trmj/must-gather-rftc5"] Feb 28 04:37:50 crc kubenswrapper[5072]: I0228 04:37:50.739710 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2trmj/must-gather-rftc5" event={"ID":"82b82932-4d4c-4728-8067-7dff076dac55","Type":"ContainerStarted","Data":"011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807"} Feb 28 04:37:50 crc kubenswrapper[5072]: I0228 04:37:50.740157 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2trmj/must-gather-rftc5" event={"ID":"82b82932-4d4c-4728-8067-7dff076dac55","Type":"ContainerStarted","Data":"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08"} Feb 28 04:37:50 crc kubenswrapper[5072]: I0228 04:37:50.740176 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2trmj/must-gather-rftc5" event={"ID":"82b82932-4d4c-4728-8067-7dff076dac55","Type":"ContainerStarted","Data":"31a2663b0fec57e37c16df543f04c6b72c7cecca1f397bef27ef883ada315892"} Feb 28 04:37:50 crc kubenswrapper[5072]: I0228 04:37:50.761628 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2trmj/must-gather-rftc5" podStartSLOduration=1.7616046760000001 podStartE2EDuration="1.761604676s" podCreationTimestamp="2026-02-28 04:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:37:50.756234539 +0000 UTC m=+1692.750964741" watchObservedRunningTime="2026-02-28 04:37:50.761604676 +0000 UTC m=+1692.756334868" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.127832 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537558-dbqw5"] Feb 28 04:38:00 crc 
kubenswrapper[5072]: I0228 04:38:00.129247 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.131503 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.132187 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.132235 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.135763 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537558-dbqw5"] Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.315155 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwgl\" (UniqueName: \"kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl\") pod \"auto-csr-approver-29537558-dbqw5\" (UID: \"ea5b72ea-d4d2-40bd-a15e-f7baad745f88\") " pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.416358 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwgl\" (UniqueName: \"kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl\") pod \"auto-csr-approver-29537558-dbqw5\" (UID: \"ea5b72ea-d4d2-40bd-a15e-f7baad745f88\") " pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.434668 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwgl\" (UniqueName: \"kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl\") pod 
\"auto-csr-approver-29537558-dbqw5\" (UID: \"ea5b72ea-d4d2-40bd-a15e-f7baad745f88\") " pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.450353 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.658883 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:38:00 crc kubenswrapper[5072]: E0228 04:38:00.659432 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.831891 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537558-dbqw5"] Feb 28 04:38:00 crc kubenswrapper[5072]: I0228 04:38:00.837278 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 04:38:01 crc kubenswrapper[5072]: I0228 04:38:01.792401 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" event={"ID":"ea5b72ea-d4d2-40bd-a15e-f7baad745f88","Type":"ContainerStarted","Data":"aafb9e239abb9c3fecaccf5266865483e866bcdda753cf05b7bec87ab497169b"} Feb 28 04:38:02 crc kubenswrapper[5072]: I0228 04:38:02.798394 5072 generic.go:334] "Generic (PLEG): container finished" podID="ea5b72ea-d4d2-40bd-a15e-f7baad745f88" containerID="979560e5bcfc8e486bcf3571dfbc647a1148f4ae7d998e920da66ae49174bad5" exitCode=0 Feb 28 04:38:02 crc kubenswrapper[5072]: I0228 04:38:02.798582 5072 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" event={"ID":"ea5b72ea-d4d2-40bd-a15e-f7baad745f88","Type":"ContainerDied","Data":"979560e5bcfc8e486bcf3571dfbc647a1148f4ae7d998e920da66ae49174bad5"} Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.055704 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.165150 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjwgl\" (UniqueName: \"kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl\") pod \"ea5b72ea-d4d2-40bd-a15e-f7baad745f88\" (UID: \"ea5b72ea-d4d2-40bd-a15e-f7baad745f88\") " Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.171787 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl" (OuterVolumeSpecName: "kube-api-access-pjwgl") pod "ea5b72ea-d4d2-40bd-a15e-f7baad745f88" (UID: "ea5b72ea-d4d2-40bd-a15e-f7baad745f88"). InnerVolumeSpecName "kube-api-access-pjwgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.266926 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjwgl\" (UniqueName: \"kubernetes.io/projected/ea5b72ea-d4d2-40bd-a15e-f7baad745f88-kube-api-access-pjwgl\") on node \"crc\" DevicePath \"\"" Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.812497 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" event={"ID":"ea5b72ea-d4d2-40bd-a15e-f7baad745f88","Type":"ContainerDied","Data":"aafb9e239abb9c3fecaccf5266865483e866bcdda753cf05b7bec87ab497169b"} Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.812533 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aafb9e239abb9c3fecaccf5266865483e866bcdda753cf05b7bec87ab497169b" Feb 28 04:38:04 crc kubenswrapper[5072]: I0228 04:38:04.812534 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537558-dbqw5" Feb 28 04:38:05 crc kubenswrapper[5072]: I0228 04:38:05.113768 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537552-ms869"] Feb 28 04:38:05 crc kubenswrapper[5072]: I0228 04:38:05.118289 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537552-ms869"] Feb 28 04:38:06 crc kubenswrapper[5072]: I0228 04:38:06.667828 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e1152f-ab56-472e-b5b2-d859aee63a9c" path="/var/lib/kubelet/pods/28e1152f-ab56-472e-b5b2-d859aee63a9c/volumes" Feb 28 04:38:15 crc kubenswrapper[5072]: I0228 04:38:15.659078 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:38:15 crc kubenswrapper[5072]: E0228 04:38:15.660753 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:38:21 crc kubenswrapper[5072]: I0228 04:38:21.796315 5072 scope.go:117] "RemoveContainer" containerID="dc88d0c53537db4a17fc1d8e453cb3b7175c49858db7696b95234399a7f86d10" Feb 28 04:38:27 crc kubenswrapper[5072]: I0228 04:38:27.659068 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:38:27 crc kubenswrapper[5072]: E0228 04:38:27.660640 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:38:34 crc kubenswrapper[5072]: I0228 04:38:34.076488 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zc5mk_672db961-8de6-46ec-9dd8-5d2ef7572eef/control-plane-machine-set-operator/0.log" Feb 28 04:38:34 crc kubenswrapper[5072]: I0228 04:38:34.218342 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9vfgz_707bbe1d-eb9e-4d9d-8e70-e88429b8c077/kube-rbac-proxy/0.log" Feb 28 04:38:34 crc kubenswrapper[5072]: I0228 04:38:34.233662 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9vfgz_707bbe1d-eb9e-4d9d-8e70-e88429b8c077/machine-api-operator/0.log" Feb 28 
04:38:40 crc kubenswrapper[5072]: I0228 04:38:40.659257 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:38:40 crc kubenswrapper[5072]: E0228 04:38:40.660029 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:38:55 crc kubenswrapper[5072]: I0228 04:38:55.659194 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:38:55 crc kubenswrapper[5072]: E0228 04:38:55.659846 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.066112 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wf6c2_81c4d0e9-644c-4f99-af4b-0d73be068ca2/kube-rbac-proxy/0.log" Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.129654 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-wf6c2_81c4d0e9-644c-4f99-af4b-0d73be068ca2/controller/0.log" Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.268714 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log"
Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.457706 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log"
Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.770533 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log"
Feb 28 04:39:04 crc kubenswrapper[5072]: I0228 04:39:04.909507 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.061343 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.211632 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.398785 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.440984 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-frr-files/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.446880 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.488658 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.556196 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-reloader/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.602302 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/cp-metrics/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.630838 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/controller/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.802258 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/frr-metrics/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.813913 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/kube-rbac-proxy/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.826858 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/kube-rbac-proxy-frr/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.973136 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/frr/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.976978 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l4kss_fc3e2173-5582-4edb-b330-fb46053b22e2/reloader/0.log"
Feb 28 04:39:05 crc kubenswrapper[5072]: I0228 04:39:05.979574 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-6mp9v_9783d250-c2b9-4e29-a8d7-94d92d301478/frr-k8s-webhook-server/0.log"
Feb 28 04:39:06 crc kubenswrapper[5072]: I0228 04:39:06.158393 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85768d6f57-5rpmg_66660768-8bc9-40af-baab-529d0820c10b/manager/0.log"
Feb 28 04:39:06 crc kubenswrapper[5072]: I0228 04:39:06.227896 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b95579fd-hmq5d_67512bfe-55b8-4df0-aa98-54225fc624a3/webhook-server/0.log"
Feb 28 04:39:06 crc kubenswrapper[5072]: I0228 04:39:06.369148 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vlqxq_7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c/kube-rbac-proxy/0.log"
Feb 28 04:39:06 crc kubenswrapper[5072]: I0228 04:39:06.448805 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vlqxq_7a3ec2ec-a764-4325-8c9a-d2a7a9492c8c/speaker/0.log"
Feb 28 04:39:08 crc kubenswrapper[5072]: I0228 04:39:08.663333 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:39:08 crc kubenswrapper[5072]: E0228 04:39:08.663933 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:39:22 crc kubenswrapper[5072]: I0228 04:39:22.658998 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:39:22 crc kubenswrapper[5072]: E0228 04:39:22.659788 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.221456 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.378221 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.414793 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.445202 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.600462 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-content/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.625557 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/extract-utilities/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.778711 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log"
Feb 28 04:39:28 crc kubenswrapper[5072]: I0228 04:39:28.949672 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xzs42_16857e88-4eaa-40bb-86cb-04fd3da8babe/registry-server/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.010014 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.017047 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.034574 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.185489 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-utilities/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.199814 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/extract-content/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.359082 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.527961 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qf7q_6dbe0794-9375-4056-98bf-7ae9f9f10093/registry-server/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.660415 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.673197 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.791135 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.863549 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/util/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.881531 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/extract/0.log"
Feb 28 04:39:29 crc kubenswrapper[5072]: I0228 04:39:29.925061 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4k2dlt_c8b2c327-09b5-479e-b2e9-8edb01862f59/pull/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.021200 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jxjmq_cf5cb269-db5d-4b8d-ba70-a583c95dd586/marketplace-operator/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.107906 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.232573 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.262171 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.265515 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.379879 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-utilities/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.404847 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.526074 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.537848 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vrkw_df13e8d9-b1b8-4ed6-b16a-543fe5b71d46/registry-server/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.721759 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.739819 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.742563 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.926832 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-content/0.log"
Feb 28 04:39:30 crc kubenswrapper[5072]: I0228 04:39:30.936354 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/extract-utilities/0.log"
Feb 28 04:39:31 crc kubenswrapper[5072]: I0228 04:39:31.241335 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-64jqj_4fa349e7-fe1e-47f4-80bd-7d0e1bf55719/registry-server/0.log"
Feb 28 04:39:33 crc kubenswrapper[5072]: I0228 04:39:33.658862 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:39:33 crc kubenswrapper[5072]: E0228 04:39:33.659053 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:39:47 crc kubenswrapper[5072]: I0228 04:39:47.658557 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:39:47 crc kubenswrapper[5072]: E0228 04:39:47.659230 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:39:59 crc kubenswrapper[5072]: I0228 04:39:59.659441 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:39:59 crc kubenswrapper[5072]: E0228 04:39:59.660571 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.129396 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537560-lwv46"]
Feb 28 04:40:00 crc kubenswrapper[5072]: E0228 04:40:00.129676 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5b72ea-d4d2-40bd-a15e-f7baad745f88" containerName="oc"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.129691 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5b72ea-d4d2-40bd-a15e-f7baad745f88" containerName="oc"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.129830 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5b72ea-d4d2-40bd-a15e-f7baad745f88" containerName="oc"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.132034 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.135787 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.135792 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.136468 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.156270 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537560-lwv46"]
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.306359 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgcnb\" (UniqueName: \"kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb\") pod \"auto-csr-approver-29537560-lwv46\" (UID: \"45333a76-d741-43ad-9839-04e95a9e22f3\") " pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.407892 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgcnb\" (UniqueName: \"kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb\") pod \"auto-csr-approver-29537560-lwv46\" (UID: \"45333a76-d741-43ad-9839-04e95a9e22f3\") " pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.424476 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgcnb\" (UniqueName: \"kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb\") pod \"auto-csr-approver-29537560-lwv46\" (UID: \"45333a76-d741-43ad-9839-04e95a9e22f3\") " pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:00 crc kubenswrapper[5072]: I0228 04:40:00.516279 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:01 crc kubenswrapper[5072]: I0228 04:40:01.088187 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537560-lwv46"]
Feb 28 04:40:01 crc kubenswrapper[5072]: I0228 04:40:01.530426 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537560-lwv46" event={"ID":"45333a76-d741-43ad-9839-04e95a9e22f3","Type":"ContainerStarted","Data":"57171b8baaad62867ab5eec98f1f1cde37b96e5ddbc7c541ac5b083ad3bac9c6"}
Feb 28 04:40:03 crc kubenswrapper[5072]: I0228 04:40:03.555437 5072 generic.go:334] "Generic (PLEG): container finished" podID="45333a76-d741-43ad-9839-04e95a9e22f3" containerID="f0180f734e13a11df8665d700ef61166a8276273515e14b3b5fde15ec3f638b9" exitCode=0
Feb 28 04:40:03 crc kubenswrapper[5072]: I0228 04:40:03.555740 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537560-lwv46" event={"ID":"45333a76-d741-43ad-9839-04e95a9e22f3","Type":"ContainerDied","Data":"f0180f734e13a11df8665d700ef61166a8276273515e14b3b5fde15ec3f638b9"}
Feb 28 04:40:04 crc kubenswrapper[5072]: I0228 04:40:04.906436 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:04 crc kubenswrapper[5072]: I0228 04:40:04.978670 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgcnb\" (UniqueName: \"kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb\") pod \"45333a76-d741-43ad-9839-04e95a9e22f3\" (UID: \"45333a76-d741-43ad-9839-04e95a9e22f3\") "
Feb 28 04:40:04 crc kubenswrapper[5072]: I0228 04:40:04.983096 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb" (OuterVolumeSpecName: "kube-api-access-kgcnb") pod "45333a76-d741-43ad-9839-04e95a9e22f3" (UID: "45333a76-d741-43ad-9839-04e95a9e22f3"). InnerVolumeSpecName "kube-api-access-kgcnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.080379 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgcnb\" (UniqueName: \"kubernetes.io/projected/45333a76-d741-43ad-9839-04e95a9e22f3-kube-api-access-kgcnb\") on node \"crc\" DevicePath \"\""
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.575506 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537560-lwv46" event={"ID":"45333a76-d741-43ad-9839-04e95a9e22f3","Type":"ContainerDied","Data":"57171b8baaad62867ab5eec98f1f1cde37b96e5ddbc7c541ac5b083ad3bac9c6"}
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.575566 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57171b8baaad62867ab5eec98f1f1cde37b96e5ddbc7c541ac5b083ad3bac9c6"
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.575635 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537560-lwv46"
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.972803 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537554-dnn6z"]
Feb 28 04:40:05 crc kubenswrapper[5072]: I0228 04:40:05.982341 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537554-dnn6z"]
Feb 28 04:40:06 crc kubenswrapper[5072]: I0228 04:40:06.668862 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76583e4-1977-4ec3-a097-b0e22f3569dc" path="/var/lib/kubelet/pods/d76583e4-1977-4ec3-a097-b0e22f3569dc/volumes"
Feb 28 04:40:14 crc kubenswrapper[5072]: I0228 04:40:14.660808 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:40:14 crc kubenswrapper[5072]: E0228 04:40:14.661683 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:40:21 crc kubenswrapper[5072]: I0228 04:40:21.855766 5072 scope.go:117] "RemoveContainer" containerID="bb89b8819adc51354920ac27a8805b66feb8724a9eba2d0465da88f99456bab3"
Feb 28 04:40:25 crc kubenswrapper[5072]: I0228 04:40:25.659629 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:40:25 crc kubenswrapper[5072]: E0228 04:40:25.660349 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.742313 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"]
Feb 28 04:40:27 crc kubenswrapper[5072]: E0228 04:40:27.742556 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45333a76-d741-43ad-9839-04e95a9e22f3" containerName="oc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.742566 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="45333a76-d741-43ad-9839-04e95a9e22f3" containerName="oc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.742719 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="45333a76-d741-43ad-9839-04e95a9e22f3" containerName="oc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.745875 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.752147 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"]
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.805074 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.805131 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2smj\" (UniqueName: \"kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.805197 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.905960 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.906043 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.906079 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2smj\" (UniqueName: \"kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.906619 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.906938 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:27 crc kubenswrapper[5072]: I0228 04:40:27.929464 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2smj\" (UniqueName: \"kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj\") pod \"redhat-marketplace-8mrsc\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:28 crc kubenswrapper[5072]: I0228 04:40:28.068885 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:28 crc kubenswrapper[5072]: I0228 04:40:28.310416 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"]
Feb 28 04:40:28 crc kubenswrapper[5072]: I0228 04:40:28.715323 5072 generic.go:334] "Generic (PLEG): container finished" podID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerID="53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92" exitCode=0
Feb 28 04:40:28 crc kubenswrapper[5072]: I0228 04:40:28.715823 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerDied","Data":"53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92"}
Feb 28 04:40:28 crc kubenswrapper[5072]: I0228 04:40:28.717361 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerStarted","Data":"6b42818a62d41cececc9dfea644796aefbabc332e859e925b9857e702648196f"}
Feb 28 04:40:29 crc kubenswrapper[5072]: I0228 04:40:29.948369 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scxx8"]
Feb 28 04:40:29 crc kubenswrapper[5072]: I0228 04:40:29.949628 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:29 crc kubenswrapper[5072]: I0228 04:40:29.955263 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scxx8"]
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.034323 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.034402 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4v8z\" (UniqueName: \"kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.034444 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.135912 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.135995 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4v8z\" (UniqueName: \"kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.136062 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.136549 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.138355 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.172391 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4v8z\" (UniqueName: \"kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z\") pod \"certified-operators-scxx8\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.292698 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scxx8"
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.707428 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scxx8"]
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.727716 5072 generic.go:334] "Generic (PLEG): container finished" podID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerID="89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7" exitCode=0
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.727830 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerDied","Data":"89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7"}
Feb 28 04:40:30 crc kubenswrapper[5072]: I0228 04:40:30.729056 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerStarted","Data":"935fed2e0107175dcf2d4128e0b4e64dbbbea03bb5679ef6b8e07d80a72830f6"}
Feb 28 04:40:31 crc kubenswrapper[5072]: I0228 04:40:31.737120 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerStarted","Data":"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1"}
Feb 28 04:40:31 crc kubenswrapper[5072]: I0228 04:40:31.740634 5072 generic.go:334] "Generic (PLEG): container finished" podID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerID="d34294921678043e75ff8dba244ebc0176dcb696337a1c15a6e16cf78784d072" exitCode=0
Feb 28 04:40:31 crc kubenswrapper[5072]: I0228 04:40:31.740731 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerDied","Data":"d34294921678043e75ff8dba244ebc0176dcb696337a1c15a6e16cf78784d072"}
Feb 28 04:40:31 crc kubenswrapper[5072]: I0228 04:40:31.765952 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mrsc" podStartSLOduration=2.032027895 podStartE2EDuration="4.765931679s" podCreationTimestamp="2026-02-28 04:40:27 +0000 UTC" firstStartedPulling="2026-02-28 04:40:28.7281724 +0000 UTC m=+1850.722902592" lastFinishedPulling="2026-02-28 04:40:31.462076174 +0000 UTC m=+1853.456806376" observedRunningTime="2026-02-28 04:40:31.758738605 +0000 UTC m=+1853.753468847" watchObservedRunningTime="2026-02-28 04:40:31.765931679 +0000 UTC m=+1853.760661891"
Feb 28 04:40:32 crc kubenswrapper[5072]: I0228 04:40:32.747398 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerStarted","Data":"cbd5eb7f815db901975ae3633faa4e7cfdc2b5d8ccbf17303be98ab8ee79b67c"}
Feb 28 04:40:33 crc kubenswrapper[5072]: I0228 04:40:33.754699 5072 generic.go:334] "Generic (PLEG): container finished" podID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerID="cbd5eb7f815db901975ae3633faa4e7cfdc2b5d8ccbf17303be98ab8ee79b67c" exitCode=0
Feb 28 04:40:33 crc kubenswrapper[5072]: I0228 04:40:33.754745 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerDied","Data":"cbd5eb7f815db901975ae3633faa4e7cfdc2b5d8ccbf17303be98ab8ee79b67c"}
Feb 28 04:40:34 crc kubenswrapper[5072]: I0228 04:40:34.764529 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerStarted","Data":"fcbad3c31a6e13148771890e27c761a93aebcac5d77d5d08849b19b5b806bb10"}
Feb 28 04:40:34 crc kubenswrapper[5072]: I0228 04:40:34.795660 5072 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scxx8" podStartSLOduration=3.381462603 podStartE2EDuration="5.795616577s" podCreationTimestamp="2026-02-28 04:40:29 +0000 UTC" firstStartedPulling="2026-02-28 04:40:31.742001573 +0000 UTC m=+1853.736731785" lastFinishedPulling="2026-02-28 04:40:34.156155577 +0000 UTC m=+1856.150885759" observedRunningTime="2026-02-28 04:40:34.792129038 +0000 UTC m=+1856.786859230" watchObservedRunningTime="2026-02-28 04:40:34.795616577 +0000 UTC m=+1856.790346769"
Feb 28 04:40:36 crc kubenswrapper[5072]: I0228 04:40:36.659501 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:40:36 crc kubenswrapper[5072]: E0228 04:40:36.659758 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:40:38 crc kubenswrapper[5072]: I0228 04:40:38.069970 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:38 crc kubenswrapper[5072]: I0228 04:40:38.070361 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mrsc"
Feb 28 04:40:38 crc kubenswrapper[5072]: I0228 04:40:38.133911 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/redhat-marketplace-8mrsc" Feb 28 04:40:38 crc kubenswrapper[5072]: I0228 04:40:38.830470 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mrsc" Feb 28 04:40:39 crc kubenswrapper[5072]: I0228 04:40:39.740555 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"] Feb 28 04:40:40 crc kubenswrapper[5072]: I0228 04:40:40.292992 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:40 crc kubenswrapper[5072]: I0228 04:40:40.293099 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:40 crc kubenswrapper[5072]: I0228 04:40:40.326586 5072 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:40 crc kubenswrapper[5072]: I0228 04:40:40.797803 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mrsc" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="registry-server" containerID="cri-o://cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1" gracePeriod=2 Feb 28 04:40:40 crc kubenswrapper[5072]: I0228 04:40:40.842275 5072 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.666938 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mrsc" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.720993 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content\") pod \"857a50de-2e46-43c1-b7c9-d0a825faea2f\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.721140 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2smj\" (UniqueName: \"kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj\") pod \"857a50de-2e46-43c1-b7c9-d0a825faea2f\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.721189 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities\") pod \"857a50de-2e46-43c1-b7c9-d0a825faea2f\" (UID: \"857a50de-2e46-43c1-b7c9-d0a825faea2f\") " Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.722526 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities" (OuterVolumeSpecName: "utilities") pod "857a50de-2e46-43c1-b7c9-d0a825faea2f" (UID: "857a50de-2e46-43c1-b7c9-d0a825faea2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.726803 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.743807 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj" (OuterVolumeSpecName: "kube-api-access-w2smj") pod "857a50de-2e46-43c1-b7c9-d0a825faea2f" (UID: "857a50de-2e46-43c1-b7c9-d0a825faea2f"). InnerVolumeSpecName "kube-api-access-w2smj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.763192 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "857a50de-2e46-43c1-b7c9-d0a825faea2f" (UID: "857a50de-2e46-43c1-b7c9-d0a825faea2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.807620 5072 generic.go:334] "Generic (PLEG): container finished" podID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerID="cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1" exitCode=0 Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.808016 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mrsc" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.808069 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerDied","Data":"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1"} Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.808151 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mrsc" event={"ID":"857a50de-2e46-43c1-b7c9-d0a825faea2f","Type":"ContainerDied","Data":"6b42818a62d41cececc9dfea644796aefbabc332e859e925b9857e702648196f"} Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.808185 5072 scope.go:117] "RemoveContainer" containerID="cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.828510 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/857a50de-2e46-43c1-b7c9-d0a825faea2f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.828565 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2smj\" (UniqueName: \"kubernetes.io/projected/857a50de-2e46-43c1-b7c9-d0a825faea2f-kube-api-access-w2smj\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.829756 5072 scope.go:117] "RemoveContainer" containerID="89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.842889 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"] Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.849439 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mrsc"] Feb 28 04:40:41 crc 
kubenswrapper[5072]: I0228 04:40:41.851015 5072 scope.go:117] "RemoveContainer" containerID="53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.875821 5072 scope.go:117] "RemoveContainer" containerID="cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1" Feb 28 04:40:41 crc kubenswrapper[5072]: E0228 04:40:41.876223 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1\": container with ID starting with cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1 not found: ID does not exist" containerID="cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.876260 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1"} err="failed to get container status \"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1\": rpc error: code = NotFound desc = could not find container \"cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1\": container with ID starting with cfa0ca8c4286faee1ce864e47b56db1db48a007efbc2d6b44afcdbd30cc1d4c1 not found: ID does not exist" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.876285 5072 scope.go:117] "RemoveContainer" containerID="89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7" Feb 28 04:40:41 crc kubenswrapper[5072]: E0228 04:40:41.876529 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7\": container with ID starting with 89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7 not found: ID does not exist" 
containerID="89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.876558 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7"} err="failed to get container status \"89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7\": rpc error: code = NotFound desc = could not find container \"89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7\": container with ID starting with 89852253b67a287501b856d2c1c1a260b90605b6128dd70f857ee6374e1c64e7 not found: ID does not exist" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.876575 5072 scope.go:117] "RemoveContainer" containerID="53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92" Feb 28 04:40:41 crc kubenswrapper[5072]: E0228 04:40:41.876973 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92\": container with ID starting with 53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92 not found: ID does not exist" containerID="53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92" Feb 28 04:40:41 crc kubenswrapper[5072]: I0228 04:40:41.877023 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92"} err="failed to get container status \"53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92\": rpc error: code = NotFound desc = could not find container \"53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92\": container with ID starting with 53fd29aa1e9318291c8cb939cb1e83236815e6a0b204cc7125c5181e7a7fae92 not found: ID does not exist" Feb 28 04:40:42 crc kubenswrapper[5072]: I0228 04:40:42.666359 5072 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" path="/var/lib/kubelet/pods/857a50de-2e46-43c1-b7c9-d0a825faea2f/volumes" Feb 28 04:40:42 crc kubenswrapper[5072]: I0228 04:40:42.732022 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scxx8"] Feb 28 04:40:42 crc kubenswrapper[5072]: I0228 04:40:42.814286 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scxx8" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="registry-server" containerID="cri-o://fcbad3c31a6e13148771890e27c761a93aebcac5d77d5d08849b19b5b806bb10" gracePeriod=2 Feb 28 04:40:43 crc kubenswrapper[5072]: I0228 04:40:43.824179 5072 generic.go:334] "Generic (PLEG): container finished" podID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerID="fcbad3c31a6e13148771890e27c761a93aebcac5d77d5d08849b19b5b806bb10" exitCode=0 Feb 28 04:40:43 crc kubenswrapper[5072]: I0228 04:40:43.824555 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerDied","Data":"fcbad3c31a6e13148771890e27c761a93aebcac5d77d5d08849b19b5b806bb10"} Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.288510 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.359260 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4v8z\" (UniqueName: \"kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z\") pod \"b5ea2b2b-7157-4e09-9318-095191b89acf\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.359395 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content\") pod \"b5ea2b2b-7157-4e09-9318-095191b89acf\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.359416 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities\") pod \"b5ea2b2b-7157-4e09-9318-095191b89acf\" (UID: \"b5ea2b2b-7157-4e09-9318-095191b89acf\") " Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.360521 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities" (OuterVolumeSpecName: "utilities") pod "b5ea2b2b-7157-4e09-9318-095191b89acf" (UID: "b5ea2b2b-7157-4e09-9318-095191b89acf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.367999 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z" (OuterVolumeSpecName: "kube-api-access-m4v8z") pod "b5ea2b2b-7157-4e09-9318-095191b89acf" (UID: "b5ea2b2b-7157-4e09-9318-095191b89acf"). InnerVolumeSpecName "kube-api-access-m4v8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.383581 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5ea2b2b-7157-4e09-9318-095191b89acf" (UID: "b5ea2b2b-7157-4e09-9318-095191b89acf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.460829 5072 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.460881 5072 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ea2b2b-7157-4e09-9318-095191b89acf-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.460903 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4v8z\" (UniqueName: \"kubernetes.io/projected/b5ea2b2b-7157-4e09-9318-095191b89acf-kube-api-access-m4v8z\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.830127 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scxx8" event={"ID":"b5ea2b2b-7157-4e09-9318-095191b89acf","Type":"ContainerDied","Data":"935fed2e0107175dcf2d4128e0b4e64dbbbea03bb5679ef6b8e07d80a72830f6"} Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.830171 5072 scope.go:117] "RemoveContainer" containerID="fcbad3c31a6e13148771890e27c761a93aebcac5d77d5d08849b19b5b806bb10" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.830252 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scxx8" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.848023 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scxx8"] Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.851884 5072 scope.go:117] "RemoveContainer" containerID="cbd5eb7f815db901975ae3633faa4e7cfdc2b5d8ccbf17303be98ab8ee79b67c" Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.853299 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-scxx8"] Feb 28 04:40:44 crc kubenswrapper[5072]: I0228 04:40:44.870280 5072 scope.go:117] "RemoveContainer" containerID="d34294921678043e75ff8dba244ebc0176dcb696337a1c15a6e16cf78784d072" Feb 28 04:40:46 crc kubenswrapper[5072]: I0228 04:40:46.668106 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" path="/var/lib/kubelet/pods/b5ea2b2b-7157-4e09-9318-095191b89acf/volumes" Feb 28 04:40:46 crc kubenswrapper[5072]: I0228 04:40:46.843259 5072 generic.go:334] "Generic (PLEG): container finished" podID="82b82932-4d4c-4728-8067-7dff076dac55" containerID="a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08" exitCode=0 Feb 28 04:40:46 crc kubenswrapper[5072]: I0228 04:40:46.843311 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2trmj/must-gather-rftc5" event={"ID":"82b82932-4d4c-4728-8067-7dff076dac55","Type":"ContainerDied","Data":"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08"} Feb 28 04:40:46 crc kubenswrapper[5072]: I0228 04:40:46.844560 5072 scope.go:117] "RemoveContainer" containerID="a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08" Feb 28 04:40:46 crc kubenswrapper[5072]: I0228 04:40:46.881791 5072 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-2trmj_must-gather-rftc5_82b82932-4d4c-4728-8067-7dff076dac55/gather/0.log" Feb 28 04:40:49 crc kubenswrapper[5072]: I0228 04:40:49.659019 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272" Feb 28 04:40:49 crc kubenswrapper[5072]: E0228 04:40:49.659482 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.007469 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2trmj/must-gather-rftc5"] Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.009067 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2trmj/must-gather-rftc5" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="copy" containerID="cri-o://011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807" gracePeriod=2 Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.013609 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2trmj/must-gather-rftc5"] Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.339253 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2trmj_must-gather-rftc5_82b82932-4d4c-4728-8067-7dff076dac55/copy/0.log" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.340164 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.434180 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzvmv\" (UniqueName: \"kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv\") pod \"82b82932-4d4c-4728-8067-7dff076dac55\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.434487 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output\") pod \"82b82932-4d4c-4728-8067-7dff076dac55\" (UID: \"82b82932-4d4c-4728-8067-7dff076dac55\") " Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.439532 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv" (OuterVolumeSpecName: "kube-api-access-kzvmv") pod "82b82932-4d4c-4728-8067-7dff076dac55" (UID: "82b82932-4d4c-4728-8067-7dff076dac55"). InnerVolumeSpecName "kube-api-access-kzvmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.491774 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "82b82932-4d4c-4728-8067-7dff076dac55" (UID: "82b82932-4d4c-4728-8067-7dff076dac55"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.535798 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzvmv\" (UniqueName: \"kubernetes.io/projected/82b82932-4d4c-4728-8067-7dff076dac55-kube-api-access-kzvmv\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.535828 5072 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82b82932-4d4c-4728-8067-7dff076dac55-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.678437 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b82932-4d4c-4728-8067-7dff076dac55" path="/var/lib/kubelet/pods/82b82932-4d4c-4728-8067-7dff076dac55/volumes" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.902013 5072 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2trmj_must-gather-rftc5_82b82932-4d4c-4728-8067-7dff076dac55/copy/0.log" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.902668 5072 generic.go:334] "Generic (PLEG): container finished" podID="82b82932-4d4c-4728-8067-7dff076dac55" containerID="011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807" exitCode=143 Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.902720 5072 scope.go:117] "RemoveContainer" containerID="011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.902874 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2trmj/must-gather-rftc5" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.923889 5072 scope.go:117] "RemoveContainer" containerID="a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.975805 5072 scope.go:117] "RemoveContainer" containerID="011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807" Feb 28 04:40:56 crc kubenswrapper[5072]: E0228 04:40:56.976457 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807\": container with ID starting with 011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807 not found: ID does not exist" containerID="011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.976563 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807"} err="failed to get container status \"011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807\": rpc error: code = NotFound desc = could not find container \"011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807\": container with ID starting with 011e80d0f5ed633cd8045e64ee716a2f8d57b5513ba7c27fb9dd3c4df08a3807 not found: ID does not exist" Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.976659 5072 scope.go:117] "RemoveContainer" containerID="a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08" Feb 28 04:40:56 crc kubenswrapper[5072]: E0228 04:40:56.977066 5072 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08\": container with ID starting with 
a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08 not found: ID does not exist" containerID="a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08"
Feb 28 04:40:56 crc kubenswrapper[5072]: I0228 04:40:56.977153 5072 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08"} err="failed to get container status \"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08\": rpc error: code = NotFound desc = could not find container \"a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08\": container with ID starting with a3c3a2d4af1943b4016757b187115bfc811a315c69b3caa937326ffc4c3d7e08 not found: ID does not exist"
Feb 28 04:41:04 crc kubenswrapper[5072]: I0228 04:41:04.658718 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:41:04 crc kubenswrapper[5072]: E0228 04:41:04.659631 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:41:19 crc kubenswrapper[5072]: I0228 04:41:19.659315 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:41:19 crc kubenswrapper[5072]: E0228 04:41:19.660205 5072 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lrpf_openshift-machine-config-operator(a035bbab-1d8f-4120-aaf7-88984d936939)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939"
Feb 28 04:41:24 crc kubenswrapper[5072]: I0228 04:41:24.545219 5072 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:24 crc kubenswrapper[5072]: I0228 04:41:24.546085 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:24 crc kubenswrapper[5072]: I0228 04:41:24.587957 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-6b95579fd-hmq5d" podUID="67512bfe-55b8-4df0-aa98-54225fc624a3" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.20:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.741379 5072 patch_prober.go:28] interesting pod/controller-manager-85c5bddd8b-x7wlb container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.741497 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" podUID="59e44cd9-beba-43f0-966d-32147af5d418" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.742241 5072 patch_prober.go:28] interesting pod/route-controller-manager-7c5bcd9558-68fpj container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.71:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.742382 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" podUID="f226bc96-3265-4f70-8949-e0acba8ad2da" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.71:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.742539 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-l4kss" podUID="fc3e2173-5582-4edb-b330-fb46053b22e2" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.752967 5072 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-bsswj container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.753030 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-bsswj" podUID="f0aaa88f-ecf9-47b0-9349-737e855a9ed4" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.765971 5072 patch_prober.go:28] interesting pod/route-controller-manager-7c5bcd9558-68fpj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.71:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.766052 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c5bcd9558-68fpj" podUID="f226bc96-3265-4f70-8949-e0acba8ad2da" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.71:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.766436 5072 patch_prober.go:28] interesting pod/console-operator-58897d9998-49m8v container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.766509 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podUID="c36ce709-c726-4390-abb9-2ebcaecbf1c0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.766598 5072 patch_prober.go:28] interesting pod/console-operator-58897d9998-49m8v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.766620 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-49m8v" podUID="c36ce709-c726-4390-abb9-2ebcaecbf1c0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.769164 5072 patch_prober.go:28] interesting pod/controller-manager-85c5bddd8b-x7wlb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 04:41:25 crc kubenswrapper[5072]: I0228 04:41:25.769211 5072 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-85c5bddd8b-x7wlb" podUID="59e44cd9-beba-43f0-966d-32147af5d418" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 28 04:41:25 crc kubenswrapper[5072]: E0228 04:41:25.874075 5072 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.215s"
Feb 28 04:41:30 crc kubenswrapper[5072]: I0228 04:41:30.659220 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:41:30 crc kubenswrapper[5072]: I0228 04:41:30.913329 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"646f9b427f5a88bcade12c73e0a063776d79c4edeac0e3dbac29c73e0f9a9891"}
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.141951 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537562-snzzz"]
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.144211 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="extract-content"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.144323 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="extract-content"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.144421 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="extract-utilities"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.144502 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="extract-utilities"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.144581 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="extract-content"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.144674 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="extract-content"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.144769 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.144859 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.144944 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="gather"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.145020 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="gather"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.145094 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="extract-utilities"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.145183 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="extract-utilities"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.145284 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.145359 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: E0228 04:42:00.145602 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="copy"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.145710 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="copy"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.145954 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="gather"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.146052 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="857a50de-2e46-43c1-b7c9-d0a825faea2f" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.146134 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ea2b2b-7157-4e09-9318-095191b89acf" containerName="registry-server"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.146254 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b82932-4d4c-4728-8067-7dff076dac55" containerName="copy"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.146880 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537562-snzzz"]
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.147096 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.149575 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.149886 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.151414 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.339400 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4lb\" (UniqueName: \"kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb\") pod \"auto-csr-approver-29537562-snzzz\" (UID: \"97ae5729-4226-4ab7-98a6-300bf6f2004c\") " pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.440799 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4lb\" (UniqueName: \"kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb\") pod \"auto-csr-approver-29537562-snzzz\" (UID: \"97ae5729-4226-4ab7-98a6-300bf6f2004c\") " pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.459328 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4lb\" (UniqueName: \"kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb\") pod \"auto-csr-approver-29537562-snzzz\" (UID: \"97ae5729-4226-4ab7-98a6-300bf6f2004c\") " pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.481622 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:00 crc kubenswrapper[5072]: I0228 04:42:00.656431 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537562-snzzz"]
Feb 28 04:42:01 crc kubenswrapper[5072]: I0228 04:42:01.143182 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537562-snzzz" event={"ID":"97ae5729-4226-4ab7-98a6-300bf6f2004c","Type":"ContainerStarted","Data":"d51e8b6e2818cc509cf6901a0b2e8507e56d5e3620f8646752775ae02d53d5c7"}
Feb 28 04:42:04 crc kubenswrapper[5072]: I0228 04:42:04.163443 5072 generic.go:334] "Generic (PLEG): container finished" podID="97ae5729-4226-4ab7-98a6-300bf6f2004c" containerID="14a665789fbfd05e9761ef520980031344e4a57af5edd12b94e1b008a47495bf" exitCode=0
Feb 28 04:42:04 crc kubenswrapper[5072]: I0228 04:42:04.163510 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537562-snzzz" event={"ID":"97ae5729-4226-4ab7-98a6-300bf6f2004c","Type":"ContainerDied","Data":"14a665789fbfd05e9761ef520980031344e4a57af5edd12b94e1b008a47495bf"}
Feb 28 04:42:05 crc kubenswrapper[5072]: I0228 04:42:05.349945 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:05 crc kubenswrapper[5072]: I0228 04:42:05.507103 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs4lb\" (UniqueName: \"kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb\") pod \"97ae5729-4226-4ab7-98a6-300bf6f2004c\" (UID: \"97ae5729-4226-4ab7-98a6-300bf6f2004c\") "
Feb 28 04:42:05 crc kubenswrapper[5072]: I0228 04:42:05.512306 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb" (OuterVolumeSpecName: "kube-api-access-zs4lb") pod "97ae5729-4226-4ab7-98a6-300bf6f2004c" (UID: "97ae5729-4226-4ab7-98a6-300bf6f2004c"). InnerVolumeSpecName "kube-api-access-zs4lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:42:05 crc kubenswrapper[5072]: I0228 04:42:05.609025 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs4lb\" (UniqueName: \"kubernetes.io/projected/97ae5729-4226-4ab7-98a6-300bf6f2004c-kube-api-access-zs4lb\") on node \"crc\" DevicePath \"\""
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.200067 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537562-snzzz" event={"ID":"97ae5729-4226-4ab7-98a6-300bf6f2004c","Type":"ContainerDied","Data":"d51e8b6e2818cc509cf6901a0b2e8507e56d5e3620f8646752775ae02d53d5c7"}
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.200377 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51e8b6e2818cc509cf6901a0b2e8507e56d5e3620f8646752775ae02d53d5c7"
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.200186 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537562-snzzz"
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.401805 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537556-66srt"]
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.405717 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537556-66srt"]
Feb 28 04:42:06 crc kubenswrapper[5072]: I0228 04:42:06.667569 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f274e3-9468-4418-9b22-ccda48145e19" path="/var/lib/kubelet/pods/24f274e3-9468-4418-9b22-ccda48145e19/volumes"
Feb 28 04:42:22 crc kubenswrapper[5072]: I0228 04:42:22.828287 5072 scope.go:117] "RemoveContainer" containerID="c32e3fb8e067a9f85a7dab7e939e8d29d3259641af0f724a1e446f92cabdfa52"
Feb 28 04:43:50 crc kubenswrapper[5072]: I0228 04:43:50.105394 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:43:50 crc kubenswrapper[5072]: I0228 04:43:50.105941 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.134462 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537564-xxr7l"]
Feb 28 04:44:00 crc kubenswrapper[5072]: E0228 04:44:00.137242 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ae5729-4226-4ab7-98a6-300bf6f2004c" containerName="oc"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.137273 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ae5729-4226-4ab7-98a6-300bf6f2004c" containerName="oc"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.137405 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ae5729-4226-4ab7-98a6-300bf6f2004c" containerName="oc"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.137787 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.144814 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.145000 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.145028 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-c8kvx"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.145745 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537564-xxr7l"]
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.264255 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skz5k\" (UniqueName: \"kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k\") pod \"auto-csr-approver-29537564-xxr7l\" (UID: \"b9f46694-6323-483b-a0cf-c25a7c663da0\") " pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.365843 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skz5k\" (UniqueName: \"kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k\") pod \"auto-csr-approver-29537564-xxr7l\" (UID: \"b9f46694-6323-483b-a0cf-c25a7c663da0\") " pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.405627 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skz5k\" (UniqueName: \"kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k\") pod \"auto-csr-approver-29537564-xxr7l\" (UID: \"b9f46694-6323-483b-a0cf-c25a7c663da0\") " pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.467356 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.650274 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537564-xxr7l"]
Feb 28 04:44:00 crc kubenswrapper[5072]: I0228 04:44:00.664566 5072 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 04:44:01 crc kubenswrapper[5072]: I0228 04:44:01.176114 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537564-xxr7l" event={"ID":"b9f46694-6323-483b-a0cf-c25a7c663da0","Type":"ContainerStarted","Data":"d79ff9f193eeca1a0712ebfc1f3b4eb970c2bcbfc078494a4322d203e6f3cd11"}
Feb 28 04:44:03 crc kubenswrapper[5072]: I0228 04:44:03.188746 5072 generic.go:334] "Generic (PLEG): container finished" podID="b9f46694-6323-483b-a0cf-c25a7c663da0" containerID="a398219577fca0077e543e637b5f55d7afc8b7054499524836f1d149f00776c3" exitCode=0
Feb 28 04:44:03 crc kubenswrapper[5072]: I0228 04:44:03.188940 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537564-xxr7l" event={"ID":"b9f46694-6323-483b-a0cf-c25a7c663da0","Type":"ContainerDied","Data":"a398219577fca0077e543e637b5f55d7afc8b7054499524836f1d149f00776c3"}
Feb 28 04:44:04 crc kubenswrapper[5072]: I0228 04:44:04.406590 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:04 crc kubenswrapper[5072]: I0228 04:44:04.516276 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skz5k\" (UniqueName: \"kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k\") pod \"b9f46694-6323-483b-a0cf-c25a7c663da0\" (UID: \"b9f46694-6323-483b-a0cf-c25a7c663da0\") "
Feb 28 04:44:04 crc kubenswrapper[5072]: I0228 04:44:04.529845 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k" (OuterVolumeSpecName: "kube-api-access-skz5k") pod "b9f46694-6323-483b-a0cf-c25a7c663da0" (UID: "b9f46694-6323-483b-a0cf-c25a7c663da0"). InnerVolumeSpecName "kube-api-access-skz5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:44:04 crc kubenswrapper[5072]: I0228 04:44:04.617587 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skz5k\" (UniqueName: \"kubernetes.io/projected/b9f46694-6323-483b-a0cf-c25a7c663da0-kube-api-access-skz5k\") on node \"crc\" DevicePath \"\""
Feb 28 04:44:05 crc kubenswrapper[5072]: I0228 04:44:05.200632 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537564-xxr7l" event={"ID":"b9f46694-6323-483b-a0cf-c25a7c663da0","Type":"ContainerDied","Data":"d79ff9f193eeca1a0712ebfc1f3b4eb970c2bcbfc078494a4322d203e6f3cd11"}
Feb 28 04:44:05 crc kubenswrapper[5072]: I0228 04:44:05.200708 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d79ff9f193eeca1a0712ebfc1f3b4eb970c2bcbfc078494a4322d203e6f3cd11"
Feb 28 04:44:05 crc kubenswrapper[5072]: I0228 04:44:05.200951 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537564-xxr7l"
Feb 28 04:44:05 crc kubenswrapper[5072]: I0228 04:44:05.491033 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537558-dbqw5"]
Feb 28 04:44:05 crc kubenswrapper[5072]: I0228 04:44:05.495612 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537558-dbqw5"]
Feb 28 04:44:06 crc kubenswrapper[5072]: I0228 04:44:06.673478 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea5b72ea-d4d2-40bd-a15e-f7baad745f88" path="/var/lib/kubelet/pods/ea5b72ea-d4d2-40bd-a15e-f7baad745f88/volumes"
Feb 28 04:44:20 crc kubenswrapper[5072]: I0228 04:44:20.105196 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:44:20 crc kubenswrapper[5072]: I0228 04:44:20.105726 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:44:22 crc kubenswrapper[5072]: I0228 04:44:22.892022 5072 scope.go:117] "RemoveContainer" containerID="979560e5bcfc8e486bcf3571dfbc647a1148f4ae7d998e920da66ae49174bad5"
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.106204 5072 patch_prober.go:28] interesting pod/machine-config-daemon-5lrpf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.106777 5072 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.106822 5072 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf"
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.107339 5072 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"646f9b427f5a88bcade12c73e0a063776d79c4edeac0e3dbac29c73e0f9a9891"} pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.107388 5072 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" podUID="a035bbab-1d8f-4120-aaf7-88984d936939" containerName="machine-config-daemon" containerID="cri-o://646f9b427f5a88bcade12c73e0a063776d79c4edeac0e3dbac29c73e0f9a9891" gracePeriod=600
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.435737 5072 generic.go:334] "Generic (PLEG): container finished" podID="a035bbab-1d8f-4120-aaf7-88984d936939" containerID="646f9b427f5a88bcade12c73e0a063776d79c4edeac0e3dbac29c73e0f9a9891" exitCode=0
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.435804 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerDied","Data":"646f9b427f5a88bcade12c73e0a063776d79c4edeac0e3dbac29c73e0f9a9891"}
Feb 28 04:44:50 crc kubenswrapper[5072]: I0228 04:44:50.436068 5072 scope.go:117] "RemoveContainer" containerID="0e386676382b0c8efe27f89db4a4d5fa8b4258c234eaa3b9c041a33c00eda272"
Feb 28 04:44:51 crc kubenswrapper[5072]: I0228 04:44:51.442847 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lrpf" event={"ID":"a035bbab-1d8f-4120-aaf7-88984d936939","Type":"ContainerStarted","Data":"ef6b22cc608fda96d8dd1b437c8885a59a8a3089d679337759ea2a75cb57e182"}
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.142238 5072 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"]
Feb 28 04:45:00 crc kubenswrapper[5072]: E0228 04:45:00.143046 5072 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f46694-6323-483b-a0cf-c25a7c663da0" containerName="oc"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.143058 5072 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f46694-6323-483b-a0cf-c25a7c663da0" containerName="oc"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.143157 5072 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f46694-6323-483b-a0cf-c25a7c663da0" containerName="oc"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.143672 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.146902 5072 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.147125 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.147177 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qdj\" (UniqueName: \"kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.147205 5072 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.147372 5072 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.151428 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"]
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.248563 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.248614 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qdj\" (UniqueName: \"kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.248659 5072 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.249977 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.255297 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.265743 5072 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qdj\" (UniqueName: \"kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj\") pod \"collect-profiles-29537565-wsd7f\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.466142 5072 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:00 crc kubenswrapper[5072]: I0228 04:45:00.851298 5072 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"]
Feb 28 04:45:01 crc kubenswrapper[5072]: I0228 04:45:01.522558 5072 generic.go:334] "Generic (PLEG): container finished" podID="0bcd1d52-5c86-45ce-b5cf-4a571832edb3" containerID="007bcf516d8ab1b7890f198bcf4c643fd0caf79c291f146bf68977dc2b484227" exitCode=0
Feb 28 04:45:01 crc kubenswrapper[5072]: I0228 04:45:01.522745 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f" event={"ID":"0bcd1d52-5c86-45ce-b5cf-4a571832edb3","Type":"ContainerDied","Data":"007bcf516d8ab1b7890f198bcf4c643fd0caf79c291f146bf68977dc2b484227"}
Feb 28 04:45:01 crc kubenswrapper[5072]: I0228 04:45:01.522922 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f" event={"ID":"0bcd1d52-5c86-45ce-b5cf-4a571832edb3","Type":"ContainerStarted","Data":"38dc4bd7e19ab3897933abe00b41bfbf554ef6940d8c06677761685108161e5e"}
Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.750353 5072 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f"
Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.936481 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume\") pod \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") "
Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.936550 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume\") pod \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") "
Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.936579 5072 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9qdj\" (UniqueName: \"kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj\") pod \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\" (UID: \"0bcd1d52-5c86-45ce-b5cf-4a571832edb3\") "
Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.937040 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume" (OuterVolumeSpecName: "config-volume") pod "0bcd1d52-5c86-45ce-b5cf-4a571832edb3" (UID: "0bcd1d52-5c86-45ce-b5cf-4a571832edb3"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.937448 5072 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.941571 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj" (OuterVolumeSpecName: "kube-api-access-k9qdj") pod "0bcd1d52-5c86-45ce-b5cf-4a571832edb3" (UID: "0bcd1d52-5c86-45ce-b5cf-4a571832edb3"). InnerVolumeSpecName "kube-api-access-k9qdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:45:02 crc kubenswrapper[5072]: I0228 04:45:02.941762 5072 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0bcd1d52-5c86-45ce-b5cf-4a571832edb3" (UID: "0bcd1d52-5c86-45ce-b5cf-4a571832edb3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.148273 5072 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.148311 5072 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9qdj\" (UniqueName: \"kubernetes.io/projected/0bcd1d52-5c86-45ce-b5cf-4a571832edb3-kube-api-access-k9qdj\") on node \"crc\" DevicePath \"\"" Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.535525 5072 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f" event={"ID":"0bcd1d52-5c86-45ce-b5cf-4a571832edb3","Type":"ContainerDied","Data":"38dc4bd7e19ab3897933abe00b41bfbf554ef6940d8c06677761685108161e5e"} Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.535571 5072 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38dc4bd7e19ab3897933abe00b41bfbf554ef6940d8c06677761685108161e5e" Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.535595 5072 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537565-wsd7f" Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.804720 5072 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"] Feb 28 04:45:03 crc kubenswrapper[5072]: I0228 04:45:03.807848 5072 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-d72cz"] Feb 28 04:45:04 crc kubenswrapper[5072]: I0228 04:45:04.665981 5072 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0cadda-3f93-43ac-b288-7e666a7f1b99" path="/var/lib/kubelet/pods/bd0cadda-3f93-43ac-b288-7e666a7f1b99/volumes" Feb 28 04:45:22 crc kubenswrapper[5072]: I0228 04:45:22.943601 5072 scope.go:117] "RemoveContainer" containerID="0452223c3b5f745fffaf5b0397b4c77817e783f3dbbd83c1b854cbf448faa7a1" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515150471371024451 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015150471372017367 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015150464724016515 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015150464724015465 5ustar corecore